CHECK: Is CUDA the right version (10)?
Using backbone resnet101
Using data augmentation generator
weights arg is None
Loading imagenet weights
Creating model, this may take a second...
Loading weights into model
tracking anchors
tracking anchors
tracking anchors
tracking anchors
tracking anchors
Model: "retinanet"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
input_1 (InputLayer)            (None, None, None, 3 0
__________________________________________________________________________________________________
padding_conv1 (ZeroPadding2D)   (None, None, None, 3 0           input_1[0][0]
__________________________________________________________________________________________________
conv1 (Conv2D)                  (None, None, None, 6 9408        padding_conv1[0][0]
__________________________________________________________________________________________________
bn_conv1 (BatchNormalization)   (None, None, None, 6 256         conv1[0][0]
__________________________________________________________________________________________________
conv1_relu (Activation)         (None, None, None, 6 0           bn_conv1[0][0]
__________________________________________________________________________________________________
pool1 (MaxPooling2D)            (None, None, None, 6 0           conv1_relu[0][0]
__________________________________________________________________________________________________
res2a_branch2a (Conv2D)         (None, None, None, 6 4096        pool1[0][0]
__________________________________________________________________________________________________
bn2a_branch2a (BatchNormalizati (None, None, None, 6 256         res2a_branch2a[0][0]
__________________________________________________________________________________________________
res2a_branch2a_relu (Activation (None, None, None, 6 0           bn2a_branch2a[0][0]
__________________________________________________________________________________________________
padding2a_branch2b (ZeroPadding (None, None, None, 6 0           res2a_branch2a_relu[0][0]
__________________________________________________________________________________________________
res2a_branch2b (Conv2D)         (None, None, None, 6 36864       padding2a_branch2b[0][0]
__________________________________________________________________________________________________
bn2a_branch2b (BatchNormalizati (None, None, None, 6 256         res2a_branch2b[0][0]
__________________________________________________________________________________________________
res2a_branch2b_relu (Activation (None, None, None, 6 0           bn2a_branch2b[0][0]
__________________________________________________________________________________________________
res2a_branch2c (Conv2D)         (None, None, None, 2 16384       res2a_branch2b_relu[0][0]
__________________________________________________________________________________________________
res2a_branch1 (Conv2D)          (None, None, None, 2 16384       pool1[0][0]
__________________________________________________________________________________________________
bn2a_branch2c (BatchNormalizati (None, None, None, 2 1024        res2a_branch2c[0][0]
__________________________________________________________________________________________________
bn2a_branch1 (BatchNormalizatio (None, None, None, 2 1024        res2a_branch1[0][0]
__________________________________________________________________________________________________
res2a (Add)                     (None, None, None, 2 0           bn2a_branch2c[0][0] bn2a_branch1[0][0]
__________________________________________________________________________________________________
res2a_relu (Activation)         (None, None, None, 2 0           res2a[0][0]
__________________________________________________________________________________________________
res2b_branch2a (Conv2D)         (None, None, None, 6 16384       res2a_relu[0][0]
__________________________________________________________________________________________________
bn2b_branch2a (BatchNormalizati (None, None, None, 6 256         res2b_branch2a[0][0]
__________________________________________________________________________________________________
res2b_branch2a_relu (Activation (None, None, None, 6 0           bn2b_branch2a[0][0]
__________________________________________________________________________________________________
padding2b_branch2b (ZeroPadding (None, None, None, 6 0           res2b_branch2a_relu[0][0]
__________________________________________________________________________________________________
res2b_branch2b (Conv2D)         (None, None, None, 6 36864       padding2b_branch2b[0][0]
__________________________________________________________________________________________________
bn2b_branch2b (BatchNormalizati (None, None, None, 6 256         res2b_branch2b[0][0]
__________________________________________________________________________________________________
res2b_branch2b_relu (Activation (None, None, None, 6 0           bn2b_branch2b[0][0]
__________________________________________________________________________________________________
res2b_branch2c (Conv2D)         (None, None, None, 2 16384       res2b_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn2b_branch2c (BatchNormalizati (None, None, None, 2 1024        res2b_branch2c[0][0]
__________________________________________________________________________________________________
res2b (Add)                     (None, None, None, 2 0           bn2b_branch2c[0][0] res2a_relu[0][0]
__________________________________________________________________________________________________
res2b_relu (Activation)         (None, None, None, 2 0           res2b[0][0]
__________________________________________________________________________________________________
res2c_branch2a (Conv2D)         (None, None, None, 6 16384       res2b_relu[0][0]
__________________________________________________________________________________________________
bn2c_branch2a (BatchNormalizati (None, None, None, 6 256         res2c_branch2a[0][0]
__________________________________________________________________________________________________
res2c_branch2a_relu (Activation (None, None, None, 6 0           bn2c_branch2a[0][0]
__________________________________________________________________________________________________
padding2c_branch2b (ZeroPadding (None, None, None, 6 0           res2c_branch2a_relu[0][0]
__________________________________________________________________________________________________
res2c_branch2b (Conv2D)         (None, None, None, 6 36864       padding2c_branch2b[0][0]
__________________________________________________________________________________________________
bn2c_branch2b (BatchNormalizati (None, None, None, 6 256         res2c_branch2b[0][0]
__________________________________________________________________________________________________
res2c_branch2b_relu (Activation (None, None, None, 6 0           bn2c_branch2b[0][0]
__________________________________________________________________________________________________
res2c_branch2c (Conv2D)         (None, None, None, 2 16384       res2c_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn2c_branch2c (BatchNormalizati (None, None, None, 2 1024        res2c_branch2c[0][0]
__________________________________________________________________________________________________
res2c (Add)                     (None, None, None, 2 0           bn2c_branch2c[0][0] res2b_relu[0][0]
__________________________________________________________________________________________________
res2c_relu (Activation)         (None, None, None, 2 0           res2c[0][0]
__________________________________________________________________________________________________
res3a_branch2a (Conv2D)         (None, None, None, 1 32768       res2c_relu[0][0]
__________________________________________________________________________________________________
bn3a_branch2a (BatchNormalizati (None, None, None, 1 512         res3a_branch2a[0][0]
__________________________________________________________________________________________________
res3a_branch2a_relu (Activation (None, None, None, 1 0           bn3a_branch2a[0][0]
__________________________________________________________________________________________________
padding3a_branch2b (ZeroPadding (None, None, None, 1 0           res3a_branch2a_relu[0][0]
__________________________________________________________________________________________________
res3a_branch2b (Conv2D)         (None, None, None, 1 147456      padding3a_branch2b[0][0]
__________________________________________________________________________________________________
bn3a_branch2b (BatchNormalizati (None, None, None, 1 512         res3a_branch2b[0][0]
__________________________________________________________________________________________________
res3a_branch2b_relu (Activation (None, None, None, 1 0           bn3a_branch2b[0][0]
__________________________________________________________________________________________________
res3a_branch2c (Conv2D)         (None, None, None, 5 65536       res3a_branch2b_relu[0][0]
__________________________________________________________________________________________________
res3a_branch1 (Conv2D)          (None, None, None, 5 131072      res2c_relu[0][0]
__________________________________________________________________________________________________
bn3a_branch2c (BatchNormalizati (None, None, None, 5 2048        res3a_branch2c[0][0]
__________________________________________________________________________________________________
bn3a_branch1 (BatchNormalizatio (None, None, None, 5 2048        res3a_branch1[0][0]
__________________________________________________________________________________________________
res3a (Add)                     (None, None, None, 5 0           bn3a_branch2c[0][0] bn3a_branch1[0][0]
__________________________________________________________________________________________________
res3a_relu (Activation)         (None, None, None, 5 0           res3a[0][0]
__________________________________________________________________________________________________
res3b1_branch2a (Conv2D)        (None, None, None, 1 65536       res3a_relu[0][0]
__________________________________________________________________________________________________
bn3b1_branch2a (BatchNormalizat (None, None, None, 1 512         res3b1_branch2a[0][0]
__________________________________________________________________________________________________
res3b1_branch2a_relu (Activatio (None, None, None, 1 0           bn3b1_branch2a[0][0]
__________________________________________________________________________________________________
padding3b1_branch2b (ZeroPaddin (None, None, None, 1 0           res3b1_branch2a_relu[0][0]
__________________________________________________________________________________________________
res3b1_branch2b (Conv2D)        (None, None, None, 1 147456      padding3b1_branch2b[0][0]
__________________________________________________________________________________________________
bn3b1_branch2b (BatchNormalizat (None, None, None, 1 512         res3b1_branch2b[0][0]
__________________________________________________________________________________________________
res3b1_branch2b_relu (Activatio (None, None, None, 1 0           bn3b1_branch2b[0][0]
__________________________________________________________________________________________________
res3b1_branch2c (Conv2D)        (None, None, None, 5 65536       res3b1_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn3b1_branch2c (BatchNormalizat (None, None, None, 5 2048        res3b1_branch2c[0][0]
__________________________________________________________________________________________________
res3b1 (Add)                    (None, None, None, 5 0           bn3b1_branch2c[0][0] res3a_relu[0][0]
__________________________________________________________________________________________________
res3b1_relu (Activation)        (None, None, None, 5 0           res3b1[0][0]
__________________________________________________________________________________________________
res3b2_branch2a (Conv2D)        (None, None, None, 1 65536       res3b1_relu[0][0]
__________________________________________________________________________________________________
bn3b2_branch2a (BatchNormalizat (None, None, None, 1 512         res3b2_branch2a[0][0]
__________________________________________________________________________________________________
res3b2_branch2a_relu (Activatio (None, None, None, 1 0           bn3b2_branch2a[0][0]
__________________________________________________________________________________________________
padding3b2_branch2b (ZeroPaddin (None, None, None, 1 0           res3b2_branch2a_relu[0][0]
__________________________________________________________________________________________________
res3b2_branch2b (Conv2D)        (None, None, None, 1 147456      padding3b2_branch2b[0][0]
__________________________________________________________________________________________________
bn3b2_branch2b (BatchNormalizat (None, None, None, 1 512         res3b2_branch2b[0][0]
__________________________________________________________________________________________________
res3b2_branch2b_relu (Activatio (None, None, None, 1 0           bn3b2_branch2b[0][0]
__________________________________________________________________________________________________
res3b2_branch2c (Conv2D)        (None, None, None, 5 65536       res3b2_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn3b2_branch2c (BatchNormalizat (None, None, None, 5 2048        res3b2_branch2c[0][0]
__________________________________________________________________________________________________
res3b2 (Add)                    (None, None, None, 5 0           bn3b2_branch2c[0][0] res3b1_relu[0][0]
__________________________________________________________________________________________________
res3b2_relu (Activation)        (None, None, None, 5 0           res3b2[0][0]
__________________________________________________________________________________________________
res3b3_branch2a (Conv2D)        (None, None, None, 1 65536       res3b2_relu[0][0]
__________________________________________________________________________________________________
bn3b3_branch2a (BatchNormalizat (None, None, None, 1 512         res3b3_branch2a[0][0]
__________________________________________________________________________________________________
res3b3_branch2a_relu (Activatio (None, None, None, 1 0           bn3b3_branch2a[0][0]
__________________________________________________________________________________________________
padding3b3_branch2b (ZeroPaddin (None, None, None, 1 0           res3b3_branch2a_relu[0][0]
__________________________________________________________________________________________________
res3b3_branch2b (Conv2D)        (None, None, None, 1 147456      padding3b3_branch2b[0][0]
__________________________________________________________________________________________________
bn3b3_branch2b (BatchNormalizat (None, None, None, 1 512         res3b3_branch2b[0][0]
__________________________________________________________________________________________________
res3b3_branch2b_relu (Activatio (None, None, None, 1 0           bn3b3_branch2b[0][0]
__________________________________________________________________________________________________
res3b3_branch2c (Conv2D)        (None, None, None, 5 65536       res3b3_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn3b3_branch2c (BatchNormalizat (None, None, None, 5 2048        res3b3_branch2c[0][0]
__________________________________________________________________________________________________
res3b3 (Add)                    (None, None, None, 5 0           bn3b3_branch2c[0][0] res3b2_relu[0][0]
__________________________________________________________________________________________________
res3b3_relu (Activation)        (None, None, None, 5 0           res3b3[0][0]
__________________________________________________________________________________________________
res4a_branch2a (Conv2D)         (None, None, None, 2 131072      res3b3_relu[0][0]
__________________________________________________________________________________________________
bn4a_branch2a (BatchNormalizati (None, None, None, 2 1024        res4a_branch2a[0][0]
__________________________________________________________________________________________________
res4a_branch2a_relu (Activation (None, None, None, 2 0           bn4a_branch2a[0][0]
__________________________________________________________________________________________________
padding4a_branch2b (ZeroPadding (None, None, None, 2 0           res4a_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4a_branch2b (Conv2D)         (None, None, None, 2 589824      padding4a_branch2b[0][0]
__________________________________________________________________________________________________
bn4a_branch2b (BatchNormalizati (None, None, None, 2 1024        res4a_branch2b[0][0]
__________________________________________________________________________________________________
res4a_branch2b_relu (Activation (None, None, None, 2 0           bn4a_branch2b[0][0]
__________________________________________________________________________________________________
res4a_branch2c (Conv2D)         (None, None, None, 1 262144      res4a_branch2b_relu[0][0]
__________________________________________________________________________________________________
res4a_branch1 (Conv2D)          (None, None, None, 1 524288      res3b3_relu[0][0]
__________________________________________________________________________________________________
bn4a_branch2c (BatchNormalizati (None, None, None, 1 4096        res4a_branch2c[0][0]
__________________________________________________________________________________________________
bn4a_branch1 (BatchNormalizatio (None, None, None, 1 4096        res4a_branch1[0][0]
__________________________________________________________________________________________________
res4a (Add)                     (None, None, None, 1 0           bn4a_branch2c[0][0] bn4a_branch1[0][0]
__________________________________________________________________________________________________
res4a_relu (Activation)         (None, None, None, 1 0           res4a[0][0]
__________________________________________________________________________________________________
res4b1_branch2a (Conv2D)        (None, None, None, 2 262144      res4a_relu[0][0]
__________________________________________________________________________________________________
bn4b1_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b1_branch2a[0][0]
__________________________________________________________________________________________________
res4b1_branch2a_relu (Activatio (None, None, None, 2 0           bn4b1_branch2a[0][0]
__________________________________________________________________________________________________
padding4b1_branch2b (ZeroPaddin (None, None, None, 2 0           res4b1_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b1_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b1_branch2b[0][0]
__________________________________________________________________________________________________
bn4b1_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b1_branch2b[0][0]
__________________________________________________________________________________________________
res4b1_branch2b_relu (Activatio (None, None, None, 2 0           bn4b1_branch2b[0][0]
__________________________________________________________________________________________________
res4b1_branch2c (Conv2D)        (None, None, None, 1 262144      res4b1_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b1_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b1_branch2c[0][0]
__________________________________________________________________________________________________
res4b1 (Add)                    (None, None, None, 1 0           bn4b1_branch2c[0][0] res4a_relu[0][0]
__________________________________________________________________________________________________
res4b1_relu (Activation)        (None, None, None, 1 0           res4b1[0][0]
__________________________________________________________________________________________________
res4b2_branch2a (Conv2D)        (None, None, None, 2 262144      res4b1_relu[0][0]
__________________________________________________________________________________________________
bn4b2_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b2_branch2a[0][0]
__________________________________________________________________________________________________
res4b2_branch2a_relu (Activatio (None, None, None, 2 0           bn4b2_branch2a[0][0]
__________________________________________________________________________________________________
padding4b2_branch2b (ZeroPaddin (None, None, None, 2 0           res4b2_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b2_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b2_branch2b[0][0]
__________________________________________________________________________________________________
bn4b2_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b2_branch2b[0][0]
__________________________________________________________________________________________________
res4b2_branch2b_relu (Activatio (None, None, None, 2 0           bn4b2_branch2b[0][0]
__________________________________________________________________________________________________
res4b2_branch2c (Conv2D)        (None, None, None, 1 262144      res4b2_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b2_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b2_branch2c[0][0]
__________________________________________________________________________________________________
res4b2 (Add)                    (None, None, None, 1 0           bn4b2_branch2c[0][0] res4b1_relu[0][0]
__________________________________________________________________________________________________
res4b2_relu (Activation)        (None, None, None, 1 0           res4b2[0][0]
__________________________________________________________________________________________________
res4b3_branch2a (Conv2D)        (None, None, None, 2 262144      res4b2_relu[0][0]
__________________________________________________________________________________________________
bn4b3_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b3_branch2a[0][0]
__________________________________________________________________________________________________
res4b3_branch2a_relu (Activatio (None, None, None, 2 0           bn4b3_branch2a[0][0]
__________________________________________________________________________________________________
padding4b3_branch2b (ZeroPaddin (None, None, None, 2 0           res4b3_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b3_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b3_branch2b[0][0]
__________________________________________________________________________________________________
bn4b3_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b3_branch2b[0][0]
__________________________________________________________________________________________________
res4b3_branch2b_relu (Activatio (None, None, None, 2 0           bn4b3_branch2b[0][0]
__________________________________________________________________________________________________
res4b3_branch2c (Conv2D)        (None, None, None, 1 262144      res4b3_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b3_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b3_branch2c[0][0]
__________________________________________________________________________________________________
res4b3 (Add)                    (None, None, None, 1 0           bn4b3_branch2c[0][0] res4b2_relu[0][0]
__________________________________________________________________________________________________
res4b3_relu (Activation)        (None, None, None, 1 0           res4b3[0][0]
__________________________________________________________________________________________________
res4b4_branch2a (Conv2D)        (None, None, None, 2 262144      res4b3_relu[0][0]
__________________________________________________________________________________________________
bn4b4_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b4_branch2a[0][0]
__________________________________________________________________________________________________
res4b4_branch2a_relu (Activatio (None, None, None, 2 0           bn4b4_branch2a[0][0]
__________________________________________________________________________________________________
padding4b4_branch2b (ZeroPaddin (None, None, None, 2 0           res4b4_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b4_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b4_branch2b[0][0]
__________________________________________________________________________________________________
bn4b4_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b4_branch2b[0][0]
__________________________________________________________________________________________________
res4b4_branch2b_relu (Activatio (None, None, None, 2 0           bn4b4_branch2b[0][0]
__________________________________________________________________________________________________
res4b4_branch2c (Conv2D)        (None, None, None, 1 262144      res4b4_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b4_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b4_branch2c[0][0]
__________________________________________________________________________________________________
res4b4 (Add)                    (None, None, None, 1 0           bn4b4_branch2c[0][0] res4b3_relu[0][0]
__________________________________________________________________________________________________
res4b4_relu (Activation)        (None, None, None, 1 0           res4b4[0][0]
__________________________________________________________________________________________________
res4b5_branch2a (Conv2D)        (None, None, None, 2 262144      res4b4_relu[0][0]
__________________________________________________________________________________________________
bn4b5_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b5_branch2a[0][0]
__________________________________________________________________________________________________
res4b5_branch2a_relu (Activatio (None, None, None, 2 0           bn4b5_branch2a[0][0]
__________________________________________________________________________________________________
padding4b5_branch2b (ZeroPaddin (None, None, None, 2 0           res4b5_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b5_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b5_branch2b[0][0]
__________________________________________________________________________________________________
bn4b5_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b5_branch2b[0][0]
__________________________________________________________________________________________________
res4b5_branch2b_relu (Activatio (None, None, None, 2 0           bn4b5_branch2b[0][0]
__________________________________________________________________________________________________
res4b5_branch2c (Conv2D)        (None, None, None, 1 262144      res4b5_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b5_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b5_branch2c[0][0]
__________________________________________________________________________________________________
res4b5 (Add)                    (None, None, None, 1 0           bn4b5_branch2c[0][0] res4b4_relu[0][0]
__________________________________________________________________________________________________
res4b5_relu (Activation)        (None, None, None, 1 0           res4b5[0][0]
__________________________________________________________________________________________________
res4b6_branch2a (Conv2D)        (None, None, None, 2 262144      res4b5_relu[0][0]
__________________________________________________________________________________________________
bn4b6_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b6_branch2a[0][0]
__________________________________________________________________________________________________
res4b6_branch2a_relu (Activatio (None, None, None, 2 0           bn4b6_branch2a[0][0]
__________________________________________________________________________________________________
padding4b6_branch2b (ZeroPaddin (None, None, None, 2 0           res4b6_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b6_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b6_branch2b[0][0]
__________________________________________________________________________________________________
bn4b6_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b6_branch2b[0][0]
__________________________________________________________________________________________________
res4b6_branch2b_relu (Activatio (None, None, None, 2 0           bn4b6_branch2b[0][0]
__________________________________________________________________________________________________
res4b6_branch2c (Conv2D)        (None, None, None, 1 262144      res4b6_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b6_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b6_branch2c[0][0]
__________________________________________________________________________________________________
res4b6 (Add)                    (None, None, None, 1 0           bn4b6_branch2c[0][0] res4b5_relu[0][0]
__________________________________________________________________________________________________
res4b6_relu (Activation)        (None, None, None, 1 0           res4b6[0][0]
__________________________________________________________________________________________________
res4b7_branch2a (Conv2D)        (None, None, None, 2 262144      res4b6_relu[0][0]
__________________________________________________________________________________________________
bn4b7_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b7_branch2a[0][0]
__________________________________________________________________________________________________
res4b7_branch2a_relu (Activatio (None, None, None, 2 0           bn4b7_branch2a[0][0]
__________________________________________________________________________________________________
padding4b7_branch2b (ZeroPaddin (None, None, None, 2 0           res4b7_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b7_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b7_branch2b[0][0]
__________________________________________________________________________________________________
bn4b7_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b7_branch2b[0][0]
__________________________________________________________________________________________________
res4b7_branch2b_relu (Activatio (None, None, None, 2 0           bn4b7_branch2b[0][0]
__________________________________________________________________________________________________
res4b7_branch2c (Conv2D)        (None, None, None, 1 262144      res4b7_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b7_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b7_branch2c[0][0]
__________________________________________________________________________________________________
res4b7 (Add)                    (None, None, None, 1 0           bn4b7_branch2c[0][0] res4b6_relu[0][0]
__________________________________________________________________________________________________
res4b7_relu (Activation)        (None, None, None, 1 0           res4b7[0][0]
__________________________________________________________________________________________________
res4b8_branch2a (Conv2D)        (None, None, None, 2 262144      res4b7_relu[0][0]
__________________________________________________________________________________________________
bn4b8_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b8_branch2a[0][0]
__________________________________________________________________________________________________
res4b8_branch2a_relu (Activatio (None, None, None, 2 0           bn4b8_branch2a[0][0]
__________________________________________________________________________________________________
padding4b8_branch2b (ZeroPaddin (None, None, None, 2 0           res4b8_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b8_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b8_branch2b[0][0]
__________________________________________________________________________________________________
bn4b8_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b8_branch2b[0][0]
__________________________________________________________________________________________________
res4b8_branch2b_relu (Activatio (None, None, None, 2 0           bn4b8_branch2b[0][0]
__________________________________________________________________________________________________
res4b8_branch2c (Conv2D)        (None, None, None, 1 262144      res4b8_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b8_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b8_branch2c[0][0]
__________________________________________________________________________________________________ res4b8 (Add) (None, None, None, 1 0 bn4b8_branch2c[0][0] res4b7_relu[0][0] __________________________________________________________________________________________________ res4b8_relu (Activation) (None, None, None, 1 0 res4b8[0][0] __________________________________________________________________________________________________ res4b9_branch2a (Conv2D) (None, None, None, 2 262144 res4b8_relu[0][0] __________________________________________________________________________________________________ bn4b9_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b9_branch2a[0][0] __________________________________________________________________________________________________ res4b9_branch2a_relu (Activatio (None, None, None, 2 0 bn4b9_branch2a[0][0] __________________________________________________________________________________________________ padding4b9_branch2b (ZeroPaddin (None, None, None, 2 0 res4b9_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b9_branch2b (Conv2D) (None, None, None, 2 589824 padding4b9_branch2b[0][0] __________________________________________________________________________________________________ bn4b9_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b9_branch2b[0][0] __________________________________________________________________________________________________ res4b9_branch2b_relu (Activatio (None, None, None, 2 0 bn4b9_branch2b[0][0] __________________________________________________________________________________________________ res4b9_branch2c (Conv2D) (None, None, None, 1 262144 res4b9_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b9_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b9_branch2c[0][0] 
__________________________________________________________________________________________________ res4b9 (Add) (None, None, None, 1 0 bn4b9_branch2c[0][0] res4b8_relu[0][0] __________________________________________________________________________________________________ res4b9_relu (Activation) (None, None, None, 1 0 res4b9[0][0] __________________________________________________________________________________________________ res4b10_branch2a (Conv2D) (None, None, None, 2 262144 res4b9_relu[0][0] __________________________________________________________________________________________________ bn4b10_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b10_branch2a[0][0] __________________________________________________________________________________________________ res4b10_branch2a_relu (Activati (None, None, None, 2 0 bn4b10_branch2a[0][0] __________________________________________________________________________________________________ padding4b10_branch2b (ZeroPaddi (None, None, None, 2 0 res4b10_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b10_branch2b (Conv2D) (None, None, None, 2 589824 padding4b10_branch2b[0][0] __________________________________________________________________________________________________ bn4b10_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b10_branch2b[0][0] __________________________________________________________________________________________________ res4b10_branch2b_relu (Activati (None, None, None, 2 0 bn4b10_branch2b[0][0] __________________________________________________________________________________________________ res4b10_branch2c (Conv2D) (None, None, None, 1 262144 res4b10_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b10_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b10_branch2c[0][0] 
__________________________________________________________________________________________________ res4b10 (Add) (None, None, None, 1 0 bn4b10_branch2c[0][0] res4b9_relu[0][0] __________________________________________________________________________________________________ res4b10_relu (Activation) (None, None, None, 1 0 res4b10[0][0] __________________________________________________________________________________________________ res4b11_branch2a (Conv2D) (None, None, None, 2 262144 res4b10_relu[0][0] __________________________________________________________________________________________________ bn4b11_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b11_branch2a[0][0] __________________________________________________________________________________________________ res4b11_branch2a_relu (Activati (None, None, None, 2 0 bn4b11_branch2a[0][0] __________________________________________________________________________________________________ padding4b11_branch2b (ZeroPaddi (None, None, None, 2 0 res4b11_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b11_branch2b (Conv2D) (None, None, None, 2 589824 padding4b11_branch2b[0][0] __________________________________________________________________________________________________ bn4b11_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b11_branch2b[0][0] __________________________________________________________________________________________________ res4b11_branch2b_relu (Activati (None, None, None, 2 0 bn4b11_branch2b[0][0] __________________________________________________________________________________________________ res4b11_branch2c (Conv2D) (None, None, None, 1 262144 res4b11_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b11_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b11_branch2c[0][0] 
__________________________________________________________________________________________________ res4b11 (Add) (None, None, None, 1 0 bn4b11_branch2c[0][0] res4b10_relu[0][0] __________________________________________________________________________________________________ res4b11_relu (Activation) (None, None, None, 1 0 res4b11[0][0] __________________________________________________________________________________________________ res4b12_branch2a (Conv2D) (None, None, None, 2 262144 res4b11_relu[0][0] __________________________________________________________________________________________________ bn4b12_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b12_branch2a[0][0] __________________________________________________________________________________________________ res4b12_branch2a_relu (Activati (None, None, None, 2 0 bn4b12_branch2a[0][0] __________________________________________________________________________________________________ padding4b12_branch2b (ZeroPaddi (None, None, None, 2 0 res4b12_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b12_branch2b (Conv2D) (None, None, None, 2 589824 padding4b12_branch2b[0][0] __________________________________________________________________________________________________ bn4b12_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b12_branch2b[0][0] __________________________________________________________________________________________________ res4b12_branch2b_relu (Activati (None, None, None, 2 0 bn4b12_branch2b[0][0] __________________________________________________________________________________________________ res4b12_branch2c (Conv2D) (None, None, None, 1 262144 res4b12_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b12_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b12_branch2c[0][0] 
__________________________________________________________________________________________________ res4b12 (Add) (None, None, None, 1 0 bn4b12_branch2c[0][0] res4b11_relu[0][0] __________________________________________________________________________________________________ res4b12_relu (Activation) (None, None, None, 1 0 res4b12[0][0] __________________________________________________________________________________________________ res4b13_branch2a (Conv2D) (None, None, None, 2 262144 res4b12_relu[0][0] __________________________________________________________________________________________________ bn4b13_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b13_branch2a[0][0] __________________________________________________________________________________________________ res4b13_branch2a_relu (Activati (None, None, None, 2 0 bn4b13_branch2a[0][0] __________________________________________________________________________________________________ padding4b13_branch2b (ZeroPaddi (None, None, None, 2 0 res4b13_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b13_branch2b (Conv2D) (None, None, None, 2 589824 padding4b13_branch2b[0][0] __________________________________________________________________________________________________ bn4b13_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b13_branch2b[0][0] __________________________________________________________________________________________________ res4b13_branch2b_relu (Activati (None, None, None, 2 0 bn4b13_branch2b[0][0] __________________________________________________________________________________________________ res4b13_branch2c (Conv2D) (None, None, None, 1 262144 res4b13_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b13_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b13_branch2c[0][0] 
__________________________________________________________________________________________________ res4b13 (Add) (None, None, None, 1 0 bn4b13_branch2c[0][0] res4b12_relu[0][0] __________________________________________________________________________________________________ res4b13_relu (Activation) (None, None, None, 1 0 res4b13[0][0] __________________________________________________________________________________________________ res4b14_branch2a (Conv2D) (None, None, None, 2 262144 res4b13_relu[0][0] __________________________________________________________________________________________________ bn4b14_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b14_branch2a[0][0] __________________________________________________________________________________________________ res4b14_branch2a_relu (Activati (None, None, None, 2 0 bn4b14_branch2a[0][0] __________________________________________________________________________________________________ padding4b14_branch2b (ZeroPaddi (None, None, None, 2 0 res4b14_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b14_branch2b (Conv2D) (None, None, None, 2 589824 padding4b14_branch2b[0][0] __________________________________________________________________________________________________ bn4b14_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b14_branch2b[0][0] __________________________________________________________________________________________________ res4b14_branch2b_relu (Activati (None, None, None, 2 0 bn4b14_branch2b[0][0] __________________________________________________________________________________________________ res4b14_branch2c (Conv2D) (None, None, None, 1 262144 res4b14_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b14_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b14_branch2c[0][0] 
__________________________________________________________________________________________________ res4b14 (Add) (None, None, None, 1 0 bn4b14_branch2c[0][0] res4b13_relu[0][0] __________________________________________________________________________________________________ res4b14_relu (Activation) (None, None, None, 1 0 res4b14[0][0] __________________________________________________________________________________________________ res4b15_branch2a (Conv2D) (None, None, None, 2 262144 res4b14_relu[0][0] __________________________________________________________________________________________________ bn4b15_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b15_branch2a[0][0] __________________________________________________________________________________________________ res4b15_branch2a_relu (Activati (None, None, None, 2 0 bn4b15_branch2a[0][0] __________________________________________________________________________________________________ padding4b15_branch2b (ZeroPaddi (None, None, None, 2 0 res4b15_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b15_branch2b (Conv2D) (None, None, None, 2 589824 padding4b15_branch2b[0][0] __________________________________________________________________________________________________ bn4b15_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b15_branch2b[0][0] __________________________________________________________________________________________________ res4b15_branch2b_relu (Activati (None, None, None, 2 0 bn4b15_branch2b[0][0] __________________________________________________________________________________________________ res4b15_branch2c (Conv2D) (None, None, None, 1 262144 res4b15_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b15_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b15_branch2c[0][0] 
__________________________________________________________________________________________________ res4b15 (Add) (None, None, None, 1 0 bn4b15_branch2c[0][0] res4b14_relu[0][0] __________________________________________________________________________________________________ res4b15_relu (Activation) (None, None, None, 1 0 res4b15[0][0] __________________________________________________________________________________________________ res4b16_branch2a (Conv2D) (None, None, None, 2 262144 res4b15_relu[0][0] __________________________________________________________________________________________________ bn4b16_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b16_branch2a[0][0] __________________________________________________________________________________________________ res4b16_branch2a_relu (Activati (None, None, None, 2 0 bn4b16_branch2a[0][0] __________________________________________________________________________________________________ padding4b16_branch2b (ZeroPaddi (None, None, None, 2 0 res4b16_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b16_branch2b (Conv2D) (None, None, None, 2 589824 padding4b16_branch2b[0][0] __________________________________________________________________________________________________ bn4b16_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b16_branch2b[0][0] __________________________________________________________________________________________________ res4b16_branch2b_relu (Activati (None, None, None, 2 0 bn4b16_branch2b[0][0] __________________________________________________________________________________________________ res4b16_branch2c (Conv2D) (None, None, None, 1 262144 res4b16_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b16_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b16_branch2c[0][0] 
__________________________________________________________________________________________________ res4b16 (Add) (None, None, None, 1 0 bn4b16_branch2c[0][0] res4b15_relu[0][0] __________________________________________________________________________________________________ res4b16_relu (Activation) (None, None, None, 1 0 res4b16[0][0] __________________________________________________________________________________________________ res4b17_branch2a (Conv2D) (None, None, None, 2 262144 res4b16_relu[0][0] __________________________________________________________________________________________________ bn4b17_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b17_branch2a[0][0] __________________________________________________________________________________________________ res4b17_branch2a_relu (Activati (None, None, None, 2 0 bn4b17_branch2a[0][0] __________________________________________________________________________________________________ padding4b17_branch2b (ZeroPaddi (None, None, None, 2 0 res4b17_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b17_branch2b (Conv2D) (None, None, None, 2 589824 padding4b17_branch2b[0][0] __________________________________________________________________________________________________ bn4b17_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b17_branch2b[0][0] __________________________________________________________________________________________________ res4b17_branch2b_relu (Activati (None, None, None, 2 0 bn4b17_branch2b[0][0] __________________________________________________________________________________________________ res4b17_branch2c (Conv2D) (None, None, None, 1 262144 res4b17_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b17_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b17_branch2c[0][0] 
__________________________________________________________________________________________________ res4b17 (Add) (None, None, None, 1 0 bn4b17_branch2c[0][0] res4b16_relu[0][0] __________________________________________________________________________________________________ res4b17_relu (Activation) (None, None, None, 1 0 res4b17[0][0] __________________________________________________________________________________________________ res4b18_branch2a (Conv2D) (None, None, None, 2 262144 res4b17_relu[0][0] __________________________________________________________________________________________________ bn4b18_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b18_branch2a[0][0] __________________________________________________________________________________________________ res4b18_branch2a_relu (Activati (None, None, None, 2 0 bn4b18_branch2a[0][0] __________________________________________________________________________________________________ padding4b18_branch2b (ZeroPaddi (None, None, None, 2 0 res4b18_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b18_branch2b (Conv2D) (None, None, None, 2 589824 padding4b18_branch2b[0][0] __________________________________________________________________________________________________ bn4b18_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b18_branch2b[0][0] __________________________________________________________________________________________________ res4b18_branch2b_relu (Activati (None, None, None, 2 0 bn4b18_branch2b[0][0] __________________________________________________________________________________________________ res4b18_branch2c (Conv2D) (None, None, None, 1 262144 res4b18_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b18_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b18_branch2c[0][0] 
__________________________________________________________________________________________________ res4b18 (Add) (None, None, None, 1 0 bn4b18_branch2c[0][0] res4b17_relu[0][0] __________________________________________________________________________________________________ res4b18_relu (Activation) (None, None, None, 1 0 res4b18[0][0] __________________________________________________________________________________________________ res4b19_branch2a (Conv2D) (None, None, None, 2 262144 res4b18_relu[0][0] __________________________________________________________________________________________________ bn4b19_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b19_branch2a[0][0] __________________________________________________________________________________________________ res4b19_branch2a_relu (Activati (None, None, None, 2 0 bn4b19_branch2a[0][0] __________________________________________________________________________________________________ padding4b19_branch2b (ZeroPaddi (None, None, None, 2 0 res4b19_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b19_branch2b (Conv2D) (None, None, None, 2 589824 padding4b19_branch2b[0][0] __________________________________________________________________________________________________ bn4b19_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b19_branch2b[0][0] __________________________________________________________________________________________________ res4b19_branch2b_relu (Activati (None, None, None, 2 0 bn4b19_branch2b[0][0] __________________________________________________________________________________________________ res4b19_branch2c (Conv2D) (None, None, None, 1 262144 res4b19_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b19_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b19_branch2c[0][0] 
__________________________________________________________________________________________________ res4b19 (Add) (None, None, None, 1 0 bn4b19_branch2c[0][0] res4b18_relu[0][0] __________________________________________________________________________________________________ res4b19_relu (Activation) (None, None, None, 1 0 res4b19[0][0] __________________________________________________________________________________________________ res4b20_branch2a (Conv2D) (None, None, None, 2 262144 res4b19_relu[0][0] __________________________________________________________________________________________________ bn4b20_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b20_branch2a[0][0] __________________________________________________________________________________________________ res4b20_branch2a_relu (Activati (None, None, None, 2 0 bn4b20_branch2a[0][0] __________________________________________________________________________________________________ padding4b20_branch2b (ZeroPaddi (None, None, None, 2 0 res4b20_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b20_branch2b (Conv2D) (None, None, None, 2 589824 padding4b20_branch2b[0][0] __________________________________________________________________________________________________ bn4b20_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b20_branch2b[0][0] __________________________________________________________________________________________________ res4b20_branch2b_relu (Activati (None, None, None, 2 0 bn4b20_branch2b[0][0] __________________________________________________________________________________________________ res4b20_branch2c (Conv2D) (None, None, None, 1 262144 res4b20_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b20_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b20_branch2c[0][0] 
__________________________________________________________________________________________________ res4b20 (Add) (None, None, None, 1 0 bn4b20_branch2c[0][0] res4b19_relu[0][0] __________________________________________________________________________________________________ res4b20_relu (Activation) (None, None, None, 1 0 res4b20[0][0] __________________________________________________________________________________________________ res4b21_branch2a (Conv2D) (None, None, None, 2 262144 res4b20_relu[0][0] __________________________________________________________________________________________________ bn4b21_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b21_branch2a[0][0] __________________________________________________________________________________________________ res4b21_branch2a_relu (Activati (None, None, None, 2 0 bn4b21_branch2a[0][0] __________________________________________________________________________________________________ padding4b21_branch2b (ZeroPaddi (None, None, None, 2 0 res4b21_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b21_branch2b (Conv2D) (None, None, None, 2 589824 padding4b21_branch2b[0][0] __________________________________________________________________________________________________ bn4b21_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b21_branch2b[0][0] __________________________________________________________________________________________________ res4b21_branch2b_relu (Activati (None, None, None, 2 0 bn4b21_branch2b[0][0] __________________________________________________________________________________________________ res4b21_branch2c (Conv2D) (None, None, None, 1 262144 res4b21_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b21_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b21_branch2c[0][0] 
__________________________________________________________________________________________________ res4b21 (Add) (None, None, None, 1 0 bn4b21_branch2c[0][0] res4b20_relu[0][0] __________________________________________________________________________________________________ res4b21_relu (Activation) (None, None, None, 1 0 res4b21[0][0] __________________________________________________________________________________________________ res4b22_branch2a (Conv2D) (None, None, None, 2 262144 res4b21_relu[0][0] __________________________________________________________________________________________________ bn4b22_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b22_branch2a[0][0] __________________________________________________________________________________________________ res4b22_branch2a_relu (Activati (None, None, None, 2 0 bn4b22_branch2a[0][0] __________________________________________________________________________________________________ padding4b22_branch2b (ZeroPaddi (None, None, None, 2 0 res4b22_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b22_branch2b (Conv2D) (None, None, None, 2 589824 padding4b22_branch2b[0][0] __________________________________________________________________________________________________ bn4b22_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b22_branch2b[0][0] __________________________________________________________________________________________________ res4b22_branch2b_relu (Activati (None, None, None, 2 0 bn4b22_branch2b[0][0] __________________________________________________________________________________________________ res4b22_branch2c (Conv2D) (None, None, None, 1 262144 res4b22_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b22_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b22_branch2c[0][0] 
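The parameter counts in the repeated conv4 bottleneck rows above follow directly from the standard layer formulas: a bias-free `Conv2D` has `k*k*c_in*c_out` weights, and a `BatchNormalization` layer has `4*c` parameters (gamma, beta, moving mean, moving variance). A minimal sanity check, assuming the 1024-channel conv4 stage with a 256-channel bottleneck implied by the counts in the log:

```python
# Sanity-check the per-layer parameter counts of one conv4 bottleneck block.
# Assumes bias-free convolutions, consistent with the counts in the summary.

def conv2d_params(k, c_in, c_out, bias=False):
    """Parameter count of a k x k Conv2D layer."""
    return k * k * c_in * c_out + (c_out if bias else 0)

def bn_params(c):
    """Parameter count of a BatchNormalization layer over c channels
    (gamma, beta, moving mean, moving variance)."""
    return 4 * c

print(conv2d_params(1, 1024, 256))  # 262144 -> res4bN_branch2a (1x1 reduce)
print(conv2d_params(3, 256, 256))   # 589824 -> res4bN_branch2b (3x3)
print(conv2d_params(1, 256, 1024))  # 262144 -> res4bN_branch2c (1x1 expand)
print(bn_params(256))               # 1024   -> bn4bN_branch2a / bn4bN_branch2b
print(bn_params(1024))              # 4096   -> bn4bN_branch2c
```

The exact match with the logged counts (262144 rather than 262400, for instance) confirms that the backbone convolutions carry no bias term.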
__________________________________________________________________________________________________
res4b22 (Add)                    (None, None, None, 1 0           bn4b22_branch2c[0][0]
                                                                  res4b21_relu[0][0]
__________________________________________________________________________________________________
res4b22_relu (Activation)        (None, None, None, 1 0           res4b22[0][0]
__________________________________________________________________________________________________
res5a_branch2a (Conv2D)          (None, None, None, 5 524288      res4b22_relu[0][0]
__________________________________________________________________________________________________
bn5a_branch2a (BatchNormalizati  (None, None, None, 5 2048        res5a_branch2a[0][0]
__________________________________________________________________________________________________
res5a_branch2a_relu (Activation  (None, None, None, 5 0           bn5a_branch2a[0][0]
__________________________________________________________________________________________________
padding5a_branch2b (ZeroPadding  (None, None, None, 5 0           res5a_branch2a_relu[0][0]
__________________________________________________________________________________________________
res5a_branch2b (Conv2D)          (None, None, None, 5 2359296     padding5a_branch2b[0][0]
__________________________________________________________________________________________________
bn5a_branch2b (BatchNormalizati  (None, None, None, 5 2048        res5a_branch2b[0][0]
__________________________________________________________________________________________________
res5a_branch2b_relu (Activation  (None, None, None, 5 0           bn5a_branch2b[0][0]
__________________________________________________________________________________________________
res5a_branch2c (Conv2D)          (None, None, None, 2 1048576     res5a_branch2b_relu[0][0]
__________________________________________________________________________________________________
res5a_branch1 (Conv2D)           (None, None, None, 2 2097152     res4b22_relu[0][0]
__________________________________________________________________________________________________
bn5a_branch2c (BatchNormalizati  (None, None, None, 2 8192        res5a_branch2c[0][0]
__________________________________________________________________________________________________
bn5a_branch1 (BatchNormalizatio  (None, None, None, 2 8192        res5a_branch1[0][0]
__________________________________________________________________________________________________
res5a (Add)                      (None, None, None, 2 0           bn5a_branch2c[0][0]
                                                                  bn5a_branch1[0][0]
__________________________________________________________________________________________________
res5a_relu (Activation)          (None, None, None, 2 0           res5a[0][0]
__________________________________________________________________________________________________
res5b_branch2a (Conv2D)          (None, None, None, 5 1048576     res5a_relu[0][0]
__________________________________________________________________________________________________
bn5b_branch2a (BatchNormalizati  (None, None, None, 5 2048        res5b_branch2a[0][0]
__________________________________________________________________________________________________
res5b_branch2a_relu (Activation  (None, None, None, 5 0           bn5b_branch2a[0][0]
__________________________________________________________________________________________________
padding5b_branch2b (ZeroPadding  (None, None, None, 5 0           res5b_branch2a_relu[0][0]
__________________________________________________________________________________________________
res5b_branch2b (Conv2D)          (None, None, None, 5 2359296     padding5b_branch2b[0][0]
__________________________________________________________________________________________________
bn5b_branch2b (BatchNormalizati  (None, None, None, 5 2048        res5b_branch2b[0][0]
__________________________________________________________________________________________________
res5b_branch2b_relu (Activation  (None, None, None, 5 0           bn5b_branch2b[0][0]
__________________________________________________________________________________________________
res5b_branch2c (Conv2D)          (None, None, None, 2 1048576     res5b_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn5b_branch2c (BatchNormalizati  (None, None, None, 2 8192        res5b_branch2c[0][0]
__________________________________________________________________________________________________
res5b (Add)                      (None, None, None, 2 0           bn5b_branch2c[0][0]
                                                                  res5a_relu[0][0]
__________________________________________________________________________________________________
res5b_relu (Activation)          (None, None, None, 2 0           res5b[0][0]
__________________________________________________________________________________________________
res5c_branch2a (Conv2D)          (None, None, None, 5 1048576     res5b_relu[0][0]
__________________________________________________________________________________________________
bn5c_branch2a (BatchNormalizati  (None, None, None, 5 2048        res5c_branch2a[0][0]
__________________________________________________________________________________________________
res5c_branch2a_relu (Activation  (None, None, None, 5 0           bn5c_branch2a[0][0]
__________________________________________________________________________________________________
padding5c_branch2b (ZeroPadding  (None, None, None, 5 0           res5c_branch2a_relu[0][0]
__________________________________________________________________________________________________
res5c_branch2b (Conv2D)          (None, None, None, 5 2359296     padding5c_branch2b[0][0]
__________________________________________________________________________________________________
bn5c_branch2b (BatchNormalizati  (None, None, None, 5 2048        res5c_branch2b[0][0]
__________________________________________________________________________________________________
res5c_branch2b_relu (Activation  (None, None, None, 5 0           bn5c_branch2b[0][0]
__________________________________________________________________________________________________ res5c_branch2c (Conv2D) (None, None, None, 2 1048576 res5c_branch2b_relu[0][0] __________________________________________________________________________________________________ bn5c_branch2c (BatchNormalizati (None, None, None, 2 8192 res5c_branch2c[0][0] __________________________________________________________________________________________________ res5c (Add) (None, None, None, 2 0 bn5c_branch2c[0][0] res5b_relu[0][0] __________________________________________________________________________________________________ res5c_relu (Activation) (None, None, None, 2 0 res5c[0][0] __________________________________________________________________________________________________ C5_reduced (Conv2D) (None, None, None, 2 524544 res5c_relu[0][0] __________________________________________________________________________________________________ P5_upsampled (UpsampleLike) (None, None, None, 2 0 C5_reduced[0][0] res4b22_relu[0][0] __________________________________________________________________________________________________ C4_reduced (Conv2D) (None, None, None, 2 262400 res4b22_relu[0][0] __________________________________________________________________________________________________ P4_merged (Add) (None, None, None, 2 0 P5_upsampled[0][0] C4_reduced[0][0] __________________________________________________________________________________________________ P4_upsampled (UpsampleLike) (None, None, None, 2 0 P4_merged[0][0] res3b3_relu[0][0] __________________________________________________________________________________________________ C3_reduced (Conv2D) (None, None, None, 2 131328 res3b3_relu[0][0] __________________________________________________________________________________________________ P6 (Conv2D) (None, None, None, 2 4718848 res5c_relu[0][0] __________________________________________________________________________________________________ P3_merged (Add) 
(None, None, None, 2 0 P4_upsampled[0][0] C3_reduced[0][0] __________________________________________________________________________________________________ C6_relu (Activation) (None, None, None, 2 0 P6[0][0] __________________________________________________________________________________________________ P3 (Conv2D) (None, None, None, 2 590080 P3_merged[0][0] __________________________________________________________________________________________________ P4 (Conv2D) (None, None, None, 2 590080 P4_merged[0][0] __________________________________________________________________________________________________ P5 (Conv2D) (None, None, None, 2 590080 C5_reduced[0][0] __________________________________________________________________________________________________ P7 (Conv2D) (None, None, None, 2 590080 C6_relu[0][0] __________________________________________________________________________________________________ regression_submodel (Model) (None, None, 4) 2443300 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0] __________________________________________________________________________________________________ classification_submodel (Model) (None, None, 1) 2381065 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0] __________________________________________________________________________________________________ regression (Concatenate) (None, None, 4) 0 regression_submodel[1][0] regression_submodel[2][0] regression_submodel[3][0] regression_submodel[4][0] regression_submodel[5][0] __________________________________________________________________________________________________ classification (Concatenate) (None, None, 1) 0 classification_submodel[1][0] classification_submodel[2][0] classification_submodel[3][0] classification_submodel[4][0] classification_submodel[5][0] ================================================================================================== Total params: 55,427,309 Trainable params: 55,216,621 Non-trainable params: 210,688 
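The Param # column in the tail of this summary can be checked by hand: a Conv2D layer holds kh * kw * in_channels * out_channels kernel weights, plus one bias per filter when a bias is used (the ResNet bottleneck convolutions here are bias-free because BatchNormalization supplies the shift, while the FPN convolutions do carry a bias), and a BatchNormalization layer stores four values per channel. A quick sketch (helper names are my own, not from keras-retinanet) reproducing several rows above:

```python
def conv2d_params(kh, kw, in_ch, out_ch, bias=True):
    """Param # of a Conv2D layer: one kh x kw kernel weight per
    (input channel, output channel) pair, plus one bias per filter."""
    return kh * kw * in_ch * out_ch + (out_ch if bias else 0)

def batchnorm_params(channels):
    """BatchNormalization keeps gamma, beta, moving_mean and
    moving_variance per channel; only gamma and beta are trainable."""
    return 4 * channels

# ResNet bottleneck convolutions (no bias; BN supplies the shift):
assert conv2d_params(1, 1, 2048, 512, bias=False) == 1048576  # res5b_branch2a
assert conv2d_params(3, 3, 512, 512, bias=False) == 2359296   # res5b_branch2b
assert batchnorm_params(2048) == 8192                         # bn5a_branch2c

# FPN convolutions (with bias):
assert conv2d_params(1, 1, 2048, 256) == 524544   # C5_reduced
assert conv2d_params(1, 1, 1024, 256) == 262400   # C4_reduced
assert conv2d_params(3, 3, 2048, 256) == 4718848  # P6
assert conv2d_params(3, 3, 256, 256) == 590080    # P3/P4/P5/P7
```

The same arithmetic also recovers the truncated Output Shape column: a BatchNormalization row showing 8192 params must sit on 2048 channels, one showing 2048 params on 512 channels.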
__________________________________________________________________________________________________
None
Epoch 1/150
  1/500 [..............................] - ETA: 1:01:15 - loss: 4.2489 - regression_loss: 3.1190 - classification_loss: 1.1299
  2/500 [..............................] - ETA: 31:51 - loss: 4.0457 - regression_loss: 2.9161 - classification_loss: 1.1296
  ...
  5/500 [..............................] - ETA: 14:16 - loss: 4.0888 - regression_loss: 2.9592 - classification_loss: 1.1296
  ...
 10/500 [..............................] - ETA: 8:24 - loss: 4.0821 - regression_loss: 2.9532 - classification_loss: 1.1289
  ...
 25/500 [>.............................] - ETA: 4:48 - loss: 4.0291 - regression_loss: 2.9019 - classification_loss: 1.1272
  ...
 50/500 [==>...........................] - ETA: 3:30 - loss: 3.9719 - regression_loss: 2.8515 - classification_loss: 1.1204
  ...
 75/500 [===>..........................] - ETA: 2:58 - loss: 3.8978 - regression_loss: 2.7960 - classification_loss: 1.1018
  ...
100/500 [=====>........................] - ETA: 2:38 - loss: 3.7797 - regression_loss: 2.7294 - classification_loss: 1.0503
  ...
125/500 [======>.......................] - ETA: 2:23 - loss: 3.6652 - regression_loss: 2.6865 - classification_loss: 0.9788
  ...
150/500 [========>.....................] - ETA: 2:10 - loss: 3.5857 - regression_loss: 2.6606 - classification_loss: 0.9251
  ...
175/500 [=========>....................] - ETA: 1:58 - loss: 3.5322 - regression_loss: 2.6470 - classification_loss: 0.8853
  ...
200/500 [===========>..................] - ETA: 1:48 - loss: 3.4666 - regression_loss: 2.6271 - classification_loss: 0.8395
  ...
225/500 [============>.................] - ETA: 1:38 - loss: 3.4132 - regression_loss: 2.6113 - classification_loss: 0.8020
  ...
250/500 [==============>...............] - ETA: 1:28 - loss: 3.3747 - regression_loss: 2.6026 - classification_loss: 0.7721
  ...
270/500 [===============>..............] - ETA: 1:20 - loss: 3.3482 - regression_loss: 2.5935 - classification_loss: 0.7548
271/500 [===============>..............]
- ETA: 1:20 - loss: 3.3475 - regression_loss: 2.5935 - classification_loss: 0.7540 272/500 [===============>..............] - ETA: 1:20 - loss: 3.3461 - regression_loss: 2.5928 - classification_loss: 0.7534 273/500 [===============>..............] - ETA: 1:19 - loss: 3.3445 - regression_loss: 2.5921 - classification_loss: 0.7523 274/500 [===============>..............] - ETA: 1:19 - loss: 3.3432 - regression_loss: 2.5915 - classification_loss: 0.7517 275/500 [===============>..............] - ETA: 1:19 - loss: 3.3419 - regression_loss: 2.5912 - classification_loss: 0.7507 276/500 [===============>..............] - ETA: 1:18 - loss: 3.3414 - regression_loss: 2.5913 - classification_loss: 0.7502 277/500 [===============>..............] - ETA: 1:18 - loss: 3.3394 - regression_loss: 2.5905 - classification_loss: 0.7489 278/500 [===============>..............] - ETA: 1:17 - loss: 3.3376 - regression_loss: 2.5896 - classification_loss: 0.7481 279/500 [===============>..............] - ETA: 1:17 - loss: 3.3367 - regression_loss: 2.5879 - classification_loss: 0.7487 280/500 [===============>..............] - ETA: 1:17 - loss: 3.3353 - regression_loss: 2.5872 - classification_loss: 0.7481 281/500 [===============>..............] - ETA: 1:16 - loss: 3.3334 - regression_loss: 2.5866 - classification_loss: 0.7468 282/500 [===============>..............] - ETA: 1:16 - loss: 3.3320 - regression_loss: 2.5861 - classification_loss: 0.7459 283/500 [===============>..............] - ETA: 1:16 - loss: 3.3311 - regression_loss: 2.5857 - classification_loss: 0.7453 284/500 [================>.............] - ETA: 1:15 - loss: 3.3311 - regression_loss: 2.5849 - classification_loss: 0.7462 285/500 [================>.............] - ETA: 1:15 - loss: 3.3300 - regression_loss: 2.5847 - classification_loss: 0.7454 286/500 [================>.............] - ETA: 1:14 - loss: 3.3291 - regression_loss: 2.5847 - classification_loss: 0.7444 287/500 [================>.............] 
- ETA: 1:14 - loss: 3.3268 - regression_loss: 2.5838 - classification_loss: 0.7431 288/500 [================>.............] - ETA: 1:14 - loss: 3.3258 - regression_loss: 2.5834 - classification_loss: 0.7423 289/500 [================>.............] - ETA: 1:13 - loss: 3.3261 - regression_loss: 2.5838 - classification_loss: 0.7423 290/500 [================>.............] - ETA: 1:13 - loss: 3.3241 - regression_loss: 2.5829 - classification_loss: 0.7412 291/500 [================>.............] - ETA: 1:13 - loss: 3.3253 - regression_loss: 2.5849 - classification_loss: 0.7404 292/500 [================>.............] - ETA: 1:12 - loss: 3.3251 - regression_loss: 2.5849 - classification_loss: 0.7402 293/500 [================>.............] - ETA: 1:12 - loss: 3.3241 - regression_loss: 2.5844 - classification_loss: 0.7397 294/500 [================>.............] - ETA: 1:11 - loss: 3.3225 - regression_loss: 2.5838 - classification_loss: 0.7388 295/500 [================>.............] - ETA: 1:11 - loss: 3.3206 - regression_loss: 2.5830 - classification_loss: 0.7376 296/500 [================>.............] - ETA: 1:11 - loss: 3.3189 - regression_loss: 2.5823 - classification_loss: 0.7366 297/500 [================>.............] - ETA: 1:10 - loss: 3.3176 - regression_loss: 2.5821 - classification_loss: 0.7355 298/500 [================>.............] - ETA: 1:10 - loss: 3.3177 - regression_loss: 2.5829 - classification_loss: 0.7348 299/500 [================>.............] - ETA: 1:10 - loss: 3.3157 - regression_loss: 2.5822 - classification_loss: 0.7336 300/500 [=================>............] - ETA: 1:09 - loss: 3.3152 - regression_loss: 2.5823 - classification_loss: 0.7329 301/500 [=================>............] - ETA: 1:09 - loss: 3.3135 - regression_loss: 2.5817 - classification_loss: 0.7318 302/500 [=================>............] - ETA: 1:09 - loss: 3.3129 - regression_loss: 2.5812 - classification_loss: 0.7317 303/500 [=================>............] 
- ETA: 1:08 - loss: 3.3115 - regression_loss: 2.5810 - classification_loss: 0.7306 304/500 [=================>............] - ETA: 1:08 - loss: 3.3106 - regression_loss: 2.5809 - classification_loss: 0.7297 305/500 [=================>............] - ETA: 1:07 - loss: 3.3094 - regression_loss: 2.5798 - classification_loss: 0.7297 306/500 [=================>............] - ETA: 1:07 - loss: 3.3080 - regression_loss: 2.5791 - classification_loss: 0.7289 307/500 [=================>............] - ETA: 1:07 - loss: 3.3065 - regression_loss: 2.5784 - classification_loss: 0.7281 308/500 [=================>............] - ETA: 1:06 - loss: 3.3052 - regression_loss: 2.5781 - classification_loss: 0.7270 309/500 [=================>............] - ETA: 1:06 - loss: 3.3043 - regression_loss: 2.5779 - classification_loss: 0.7264 310/500 [=================>............] - ETA: 1:06 - loss: 3.3026 - regression_loss: 2.5774 - classification_loss: 0.7253 311/500 [=================>............] - ETA: 1:05 - loss: 3.3009 - regression_loss: 2.5767 - classification_loss: 0.7242 312/500 [=================>............] - ETA: 1:05 - loss: 3.2998 - regression_loss: 2.5764 - classification_loss: 0.7234 313/500 [=================>............] - ETA: 1:05 - loss: 3.2983 - regression_loss: 2.5758 - classification_loss: 0.7226 314/500 [=================>............] - ETA: 1:04 - loss: 3.2970 - regression_loss: 2.5752 - classification_loss: 0.7217 315/500 [=================>............] - ETA: 1:04 - loss: 3.2976 - regression_loss: 2.5753 - classification_loss: 0.7223 316/500 [=================>............] - ETA: 1:03 - loss: 3.2962 - regression_loss: 2.5749 - classification_loss: 0.7213 317/500 [==================>...........] - ETA: 1:03 - loss: 3.2965 - regression_loss: 2.5755 - classification_loss: 0.7210 318/500 [==================>...........] - ETA: 1:03 - loss: 3.2951 - regression_loss: 2.5749 - classification_loss: 0.7202 319/500 [==================>...........] 
- ETA: 1:02 - loss: 3.2953 - regression_loss: 2.5749 - classification_loss: 0.7204 320/500 [==================>...........] - ETA: 1:02 - loss: 3.2937 - regression_loss: 2.5742 - classification_loss: 0.7194 321/500 [==================>...........] - ETA: 1:02 - loss: 3.2921 - regression_loss: 2.5736 - classification_loss: 0.7185 322/500 [==================>...........] - ETA: 1:01 - loss: 3.2900 - regression_loss: 2.5725 - classification_loss: 0.7175 323/500 [==================>...........] - ETA: 1:01 - loss: 3.2888 - regression_loss: 2.5720 - classification_loss: 0.7168 324/500 [==================>...........] - ETA: 1:01 - loss: 3.2878 - regression_loss: 2.5716 - classification_loss: 0.7162 325/500 [==================>...........] - ETA: 1:00 - loss: 3.2877 - regression_loss: 2.5714 - classification_loss: 0.7163 326/500 [==================>...........] - ETA: 1:00 - loss: 3.2854 - regression_loss: 2.5701 - classification_loss: 0.7152 327/500 [==================>...........] - ETA: 59s - loss: 3.2844 - regression_loss: 2.5699 - classification_loss: 0.7144 328/500 [==================>...........] - ETA: 59s - loss: 3.2826 - regression_loss: 2.5693 - classification_loss: 0.7133 329/500 [==================>...........] - ETA: 59s - loss: 3.2818 - regression_loss: 2.5689 - classification_loss: 0.7129 330/500 [==================>...........] - ETA: 58s - loss: 3.2805 - regression_loss: 2.5684 - classification_loss: 0.7121 331/500 [==================>...........] - ETA: 58s - loss: 3.2802 - regression_loss: 2.5681 - classification_loss: 0.7121 332/500 [==================>...........] - ETA: 58s - loss: 3.2792 - regression_loss: 2.5678 - classification_loss: 0.7114 333/500 [==================>...........] - ETA: 57s - loss: 3.2788 - regression_loss: 2.5681 - classification_loss: 0.7107 334/500 [===================>..........] - ETA: 57s - loss: 3.2768 - regression_loss: 2.5673 - classification_loss: 0.7095 335/500 [===================>..........] 
- ETA: 57s - loss: 3.2773 - regression_loss: 2.5682 - classification_loss: 0.7091 336/500 [===================>..........] - ETA: 56s - loss: 3.2760 - regression_loss: 2.5677 - classification_loss: 0.7083 337/500 [===================>..........] - ETA: 56s - loss: 3.2749 - regression_loss: 2.5673 - classification_loss: 0.7076 338/500 [===================>..........] - ETA: 56s - loss: 3.2739 - regression_loss: 2.5672 - classification_loss: 0.7067 339/500 [===================>..........] - ETA: 55s - loss: 3.2747 - regression_loss: 2.5683 - classification_loss: 0.7064 340/500 [===================>..........] - ETA: 55s - loss: 3.2732 - regression_loss: 2.5678 - classification_loss: 0.7054 341/500 [===================>..........] - ETA: 54s - loss: 3.2748 - regression_loss: 2.5692 - classification_loss: 0.7055 342/500 [===================>..........] - ETA: 54s - loss: 3.2734 - regression_loss: 2.5687 - classification_loss: 0.7046 343/500 [===================>..........] - ETA: 54s - loss: 3.2719 - regression_loss: 2.5682 - classification_loss: 0.7038 344/500 [===================>..........] - ETA: 53s - loss: 3.2703 - regression_loss: 2.5673 - classification_loss: 0.7030 345/500 [===================>..........] - ETA: 53s - loss: 3.2698 - regression_loss: 2.5670 - classification_loss: 0.7028 346/500 [===================>..........] - ETA: 53s - loss: 3.2680 - regression_loss: 2.5663 - classification_loss: 0.7017 347/500 [===================>..........] - ETA: 52s - loss: 3.2663 - regression_loss: 2.5655 - classification_loss: 0.7008 348/500 [===================>..........] - ETA: 52s - loss: 3.2667 - regression_loss: 2.5662 - classification_loss: 0.7005 349/500 [===================>..........] - ETA: 52s - loss: 3.2652 - regression_loss: 2.5655 - classification_loss: 0.6997 350/500 [====================>.........] - ETA: 51s - loss: 3.2648 - regression_loss: 2.5654 - classification_loss: 0.6994 351/500 [====================>.........] 
- ETA: 51s - loss: 3.2635 - regression_loss: 2.5649 - classification_loss: 0.6986 352/500 [====================>.........] - ETA: 50s - loss: 3.2628 - regression_loss: 2.5646 - classification_loss: 0.6981 353/500 [====================>.........] - ETA: 50s - loss: 3.2609 - regression_loss: 2.5636 - classification_loss: 0.6972 354/500 [====================>.........] - ETA: 50s - loss: 3.2607 - regression_loss: 2.5634 - classification_loss: 0.6973 355/500 [====================>.........] - ETA: 49s - loss: 3.2595 - regression_loss: 2.5629 - classification_loss: 0.6966 356/500 [====================>.........] - ETA: 49s - loss: 3.2578 - regression_loss: 2.5621 - classification_loss: 0.6958 357/500 [====================>.........] - ETA: 49s - loss: 3.2566 - regression_loss: 2.5617 - classification_loss: 0.6949 358/500 [====================>.........] - ETA: 48s - loss: 3.2550 - regression_loss: 2.5610 - classification_loss: 0.6941 359/500 [====================>.........] - ETA: 48s - loss: 3.2538 - regression_loss: 2.5603 - classification_loss: 0.6935 360/500 [====================>.........] - ETA: 48s - loss: 3.2535 - regression_loss: 2.5606 - classification_loss: 0.6929 361/500 [====================>.........] - ETA: 47s - loss: 3.2522 - regression_loss: 2.5601 - classification_loss: 0.6920 362/500 [====================>.........] - ETA: 47s - loss: 3.2509 - regression_loss: 2.5597 - classification_loss: 0.6913 363/500 [====================>.........] - ETA: 47s - loss: 3.2499 - regression_loss: 2.5593 - classification_loss: 0.6906 364/500 [====================>.........] - ETA: 46s - loss: 3.2486 - regression_loss: 2.5587 - classification_loss: 0.6900 365/500 [====================>.........] - ETA: 46s - loss: 3.2470 - regression_loss: 2.5578 - classification_loss: 0.6892 366/500 [====================>.........] - ETA: 46s - loss: 3.2458 - regression_loss: 2.5572 - classification_loss: 0.6886 367/500 [=====================>........] 
- ETA: 45s - loss: 3.2454 - regression_loss: 2.5573 - classification_loss: 0.6882 368/500 [=====================>........] - ETA: 45s - loss: 3.2436 - regression_loss: 2.5563 - classification_loss: 0.6873 369/500 [=====================>........] - ETA: 45s - loss: 3.2428 - regression_loss: 2.5562 - classification_loss: 0.6866 370/500 [=====================>........] - ETA: 44s - loss: 3.2426 - regression_loss: 2.5564 - classification_loss: 0.6862 371/500 [=====================>........] - ETA: 44s - loss: 3.2411 - regression_loss: 2.5556 - classification_loss: 0.6855 372/500 [=====================>........] - ETA: 43s - loss: 3.2407 - regression_loss: 2.5554 - classification_loss: 0.6852 373/500 [=====================>........] - ETA: 43s - loss: 3.2386 - regression_loss: 2.5543 - classification_loss: 0.6843 374/500 [=====================>........] - ETA: 43s - loss: 3.2376 - regression_loss: 2.5540 - classification_loss: 0.6835 375/500 [=====================>........] - ETA: 42s - loss: 3.2362 - regression_loss: 2.5535 - classification_loss: 0.6827 376/500 [=====================>........] - ETA: 42s - loss: 3.2368 - regression_loss: 2.5545 - classification_loss: 0.6823 377/500 [=====================>........] - ETA: 42s - loss: 3.2359 - regression_loss: 2.5541 - classification_loss: 0.6818 378/500 [=====================>........] - ETA: 41s - loss: 3.2348 - regression_loss: 2.5536 - classification_loss: 0.6812 379/500 [=====================>........] - ETA: 41s - loss: 3.2345 - regression_loss: 2.5539 - classification_loss: 0.6807 380/500 [=====================>........] - ETA: 41s - loss: 3.2347 - regression_loss: 2.5544 - classification_loss: 0.6802 381/500 [=====================>........] - ETA: 40s - loss: 3.2336 - regression_loss: 2.5543 - classification_loss: 0.6793 382/500 [=====================>........] - ETA: 40s - loss: 3.2324 - regression_loss: 2.5538 - classification_loss: 0.6786 383/500 [=====================>........] 
- ETA: 40s - loss: 3.2314 - regression_loss: 2.5533 - classification_loss: 0.6781 384/500 [======================>.......] - ETA: 39s - loss: 3.2302 - regression_loss: 2.5527 - classification_loss: 0.6775 385/500 [======================>.......] - ETA: 39s - loss: 3.2287 - regression_loss: 2.5519 - classification_loss: 0.6768 386/500 [======================>.......] - ETA: 39s - loss: 3.2290 - regression_loss: 2.5525 - classification_loss: 0.6765 387/500 [======================>.......] - ETA: 38s - loss: 3.2285 - regression_loss: 2.5524 - classification_loss: 0.6762 388/500 [======================>.......] - ETA: 38s - loss: 3.2283 - regression_loss: 2.5527 - classification_loss: 0.6756 389/500 [======================>.......] - ETA: 38s - loss: 3.2282 - regression_loss: 2.5530 - classification_loss: 0.6752 390/500 [======================>.......] - ETA: 37s - loss: 3.2286 - regression_loss: 2.5534 - classification_loss: 0.6752 391/500 [======================>.......] - ETA: 37s - loss: 3.2272 - regression_loss: 2.5527 - classification_loss: 0.6746 392/500 [======================>.......] - ETA: 37s - loss: 3.2270 - regression_loss: 2.5523 - classification_loss: 0.6747 393/500 [======================>.......] - ETA: 36s - loss: 3.2265 - regression_loss: 2.5519 - classification_loss: 0.6745 394/500 [======================>.......] - ETA: 36s - loss: 3.2262 - regression_loss: 2.5521 - classification_loss: 0.6740 395/500 [======================>.......] - ETA: 36s - loss: 3.2259 - regression_loss: 2.5522 - classification_loss: 0.6737 396/500 [======================>.......] - ETA: 35s - loss: 3.2246 - regression_loss: 2.5516 - classification_loss: 0.6731 397/500 [======================>.......] - ETA: 35s - loss: 3.2249 - regression_loss: 2.5515 - classification_loss: 0.6734 398/500 [======================>.......] - ETA: 35s - loss: 3.2236 - regression_loss: 2.5509 - classification_loss: 0.6727 399/500 [======================>.......] 
- ETA: 34s - loss: 3.2215 - regression_loss: 2.5496 - classification_loss: 0.6719 400/500 [=======================>......] - ETA: 34s - loss: 3.2204 - regression_loss: 2.5492 - classification_loss: 0.6712 401/500 [=======================>......] - ETA: 33s - loss: 3.2199 - regression_loss: 2.5491 - classification_loss: 0.6708 402/500 [=======================>......] - ETA: 33s - loss: 3.2189 - regression_loss: 2.5486 - classification_loss: 0.6703 403/500 [=======================>......] - ETA: 33s - loss: 3.2182 - regression_loss: 2.5486 - classification_loss: 0.6696 404/500 [=======================>......] - ETA: 32s - loss: 3.2173 - regression_loss: 2.5482 - classification_loss: 0.6691 405/500 [=======================>......] - ETA: 32s - loss: 3.2160 - regression_loss: 2.5477 - classification_loss: 0.6683 406/500 [=======================>......] - ETA: 32s - loss: 3.2146 - regression_loss: 2.5469 - classification_loss: 0.6677 407/500 [=======================>......] - ETA: 31s - loss: 3.2145 - regression_loss: 2.5471 - classification_loss: 0.6674 408/500 [=======================>......] - ETA: 31s - loss: 3.2132 - regression_loss: 2.5465 - classification_loss: 0.6667 409/500 [=======================>......] - ETA: 31s - loss: 3.2117 - regression_loss: 2.5457 - classification_loss: 0.6660 410/500 [=======================>......] - ETA: 30s - loss: 3.2112 - regression_loss: 2.5454 - classification_loss: 0.6658 411/500 [=======================>......] - ETA: 30s - loss: 3.2104 - regression_loss: 2.5446 - classification_loss: 0.6658 412/500 [=======================>......] - ETA: 30s - loss: 3.2098 - regression_loss: 2.5445 - classification_loss: 0.6653 413/500 [=======================>......] - ETA: 29s - loss: 3.2091 - regression_loss: 2.5442 - classification_loss: 0.6649 414/500 [=======================>......] - ETA: 29s - loss: 3.2075 - regression_loss: 2.5434 - classification_loss: 0.6641 415/500 [=======================>......] 
- ETA: 29s - loss: 3.2066 - regression_loss: 2.5430 - classification_loss: 0.6636 416/500 [=======================>......] - ETA: 28s - loss: 3.2049 - regression_loss: 2.5422 - classification_loss: 0.6628 417/500 [========================>.....] - ETA: 28s - loss: 3.2038 - regression_loss: 2.5417 - classification_loss: 0.6621 418/500 [========================>.....] - ETA: 28s - loss: 3.2029 - regression_loss: 2.5413 - classification_loss: 0.6616 419/500 [========================>.....] - ETA: 27s - loss: 3.2031 - regression_loss: 2.5417 - classification_loss: 0.6614 420/500 [========================>.....] - ETA: 27s - loss: 3.2017 - regression_loss: 2.5406 - classification_loss: 0.6611 421/500 [========================>.....] - ETA: 27s - loss: 3.1999 - regression_loss: 2.5396 - classification_loss: 0.6603 422/500 [========================>.....] - ETA: 26s - loss: 3.1993 - regression_loss: 2.5390 - classification_loss: 0.6603 423/500 [========================>.....] - ETA: 26s - loss: 3.1981 - regression_loss: 2.5383 - classification_loss: 0.6598 424/500 [========================>.....] - ETA: 26s - loss: 3.1953 - regression_loss: 2.5367 - classification_loss: 0.6586 425/500 [========================>.....] - ETA: 25s - loss: 3.1947 - regression_loss: 2.5365 - classification_loss: 0.6582 426/500 [========================>.....] - ETA: 25s - loss: 3.1940 - regression_loss: 2.5362 - classification_loss: 0.6578 427/500 [========================>.....] - ETA: 25s - loss: 3.1928 - regression_loss: 2.5356 - classification_loss: 0.6572 428/500 [========================>.....] - ETA: 24s - loss: 3.1913 - regression_loss: 2.5349 - classification_loss: 0.6565 429/500 [========================>.....] - ETA: 24s - loss: 3.1895 - regression_loss: 2.5338 - classification_loss: 0.6558 430/500 [========================>.....] - ETA: 24s - loss: 3.1882 - regression_loss: 2.5330 - classification_loss: 0.6552 431/500 [========================>.....] 
- ETA: 23s - loss: 3.1874 - regression_loss: 2.5328 - classification_loss: 0.6546 432/500 [========================>.....] - ETA: 23s - loss: 3.1864 - regression_loss: 2.5323 - classification_loss: 0.6541 433/500 [========================>.....] - ETA: 22s - loss: 3.1857 - regression_loss: 2.5319 - classification_loss: 0.6539 434/500 [=========================>....] - ETA: 22s - loss: 3.1845 - regression_loss: 2.5311 - classification_loss: 0.6534 435/500 [=========================>....] - ETA: 22s - loss: 3.1840 - regression_loss: 2.5310 - classification_loss: 0.6530 436/500 [=========================>....] - ETA: 21s - loss: 3.1836 - regression_loss: 2.5307 - classification_loss: 0.6529 437/500 [=========================>....] - ETA: 21s - loss: 3.1830 - regression_loss: 2.5304 - classification_loss: 0.6526 438/500 [=========================>....] - ETA: 21s - loss: 3.1850 - regression_loss: 2.5319 - classification_loss: 0.6531 439/500 [=========================>....] - ETA: 20s - loss: 3.1837 - regression_loss: 2.5313 - classification_loss: 0.6524 440/500 [=========================>....] - ETA: 20s - loss: 3.1829 - regression_loss: 2.5310 - classification_loss: 0.6520 441/500 [=========================>....] - ETA: 20s - loss: 3.1820 - regression_loss: 2.5305 - classification_loss: 0.6515 442/500 [=========================>....] - ETA: 19s - loss: 3.1819 - regression_loss: 2.5308 - classification_loss: 0.6511 443/500 [=========================>....] - ETA: 19s - loss: 3.1816 - regression_loss: 2.5309 - classification_loss: 0.6507 444/500 [=========================>....] - ETA: 19s - loss: 3.1807 - regression_loss: 2.5304 - classification_loss: 0.6503 445/500 [=========================>....] - ETA: 18s - loss: 3.1801 - regression_loss: 2.5302 - classification_loss: 0.6499 446/500 [=========================>....] - ETA: 18s - loss: 3.1791 - regression_loss: 2.5296 - classification_loss: 0.6495 447/500 [=========================>....] 
- ETA: 18s - loss: 3.1783 - regression_loss: 2.5292 - classification_loss: 0.6491 448/500 [=========================>....] - ETA: 17s - loss: 3.1773 - regression_loss: 2.5287 - classification_loss: 0.6486 449/500 [=========================>....] - ETA: 17s - loss: 3.1773 - regression_loss: 2.5291 - classification_loss: 0.6482 450/500 [==========================>...] - ETA: 17s - loss: 3.1758 - regression_loss: 2.5285 - classification_loss: 0.6473 451/500 [==========================>...] - ETA: 16s - loss: 3.1735 - regression_loss: 2.5271 - classification_loss: 0.6464 452/500 [==========================>...] - ETA: 16s - loss: 3.1728 - regression_loss: 2.5266 - classification_loss: 0.6462 453/500 [==========================>...] - ETA: 16s - loss: 3.1717 - regression_loss: 2.5260 - classification_loss: 0.6457 454/500 [==========================>...] - ETA: 15s - loss: 3.1712 - regression_loss: 2.5260 - classification_loss: 0.6452 455/500 [==========================>...] - ETA: 15s - loss: 3.1710 - regression_loss: 2.5260 - classification_loss: 0.6449 456/500 [==========================>...] - ETA: 15s - loss: 3.1698 - regression_loss: 2.5255 - classification_loss: 0.6444 457/500 [==========================>...] - ETA: 14s - loss: 3.1700 - regression_loss: 2.5261 - classification_loss: 0.6439 458/500 [==========================>...] - ETA: 14s - loss: 3.1690 - regression_loss: 2.5257 - classification_loss: 0.6434 459/500 [==========================>...] - ETA: 14s - loss: 3.1681 - regression_loss: 2.5252 - classification_loss: 0.6429 460/500 [==========================>...] - ETA: 13s - loss: 3.1670 - regression_loss: 2.5247 - classification_loss: 0.6423 461/500 [==========================>...] - ETA: 13s - loss: 3.1659 - regression_loss: 2.5243 - classification_loss: 0.6416 462/500 [==========================>...] - ETA: 13s - loss: 3.1650 - regression_loss: 2.5240 - classification_loss: 0.6410 463/500 [==========================>...] 
- ETA: 12s - loss: 3.1643 - regression_loss: 2.5237 - classification_loss: 0.6405 464/500 [==========================>...] - ETA: 12s - loss: 3.1633 - regression_loss: 2.5233 - classification_loss: 0.6401 465/500 [==========================>...] - ETA: 11s - loss: 3.1626 - regression_loss: 2.5229 - classification_loss: 0.6397 466/500 [==========================>...] - ETA: 11s - loss: 3.1618 - regression_loss: 2.5226 - classification_loss: 0.6393 467/500 [===========================>..] - ETA: 11s - loss: 3.1607 - regression_loss: 2.5220 - classification_loss: 0.6388 468/500 [===========================>..] - ETA: 10s - loss: 3.1596 - regression_loss: 2.5214 - classification_loss: 0.6382 469/500 [===========================>..] - ETA: 10s - loss: 3.1588 - regression_loss: 2.5210 - classification_loss: 0.6378 470/500 [===========================>..] - ETA: 10s - loss: 3.1575 - regression_loss: 2.5203 - classification_loss: 0.6372 471/500 [===========================>..] - ETA: 9s - loss: 3.1564 - regression_loss: 2.5197 - classification_loss: 0.6367 472/500 [===========================>..] - ETA: 9s - loss: 3.1579 - regression_loss: 2.5196 - classification_loss: 0.6383 473/500 [===========================>..] - ETA: 9s - loss: 3.1566 - regression_loss: 2.5189 - classification_loss: 0.6378 474/500 [===========================>..] - ETA: 8s - loss: 3.1581 - regression_loss: 2.5204 - classification_loss: 0.6377 475/500 [===========================>..] - ETA: 8s - loss: 3.1576 - regression_loss: 2.5203 - classification_loss: 0.6373 476/500 [===========================>..] - ETA: 8s - loss: 3.1569 - regression_loss: 2.5198 - classification_loss: 0.6370 477/500 [===========================>..] - ETA: 7s - loss: 3.1555 - regression_loss: 2.5190 - classification_loss: 0.6364 478/500 [===========================>..] - ETA: 7s - loss: 3.1547 - regression_loss: 2.5188 - classification_loss: 0.6359 479/500 [===========================>..] 
- ETA: 7s - loss: 3.1532 - regression_loss: 2.5179 - classification_loss: 0.6353 480/500 [===========================>..] - ETA: 6s - loss: 3.1522 - regression_loss: 2.5173 - classification_loss: 0.6349 481/500 [===========================>..] - ETA: 6s - loss: 3.1512 - regression_loss: 2.5167 - classification_loss: 0.6345 482/500 [===========================>..] - ETA: 6s - loss: 3.1508 - regression_loss: 2.5165 - classification_loss: 0.6343 483/500 [===========================>..] - ETA: 5s - loss: 3.1498 - regression_loss: 2.5161 - classification_loss: 0.6338 484/500 [============================>.] - ETA: 5s - loss: 3.1491 - regression_loss: 2.5157 - classification_loss: 0.6334 485/500 [============================>.] - ETA: 5s - loss: 3.1495 - regression_loss: 2.5163 - classification_loss: 0.6331 486/500 [============================>.] - ETA: 4s - loss: 3.1482 - regression_loss: 2.5157 - classification_loss: 0.6326 487/500 [============================>.] - ETA: 4s - loss: 3.1474 - regression_loss: 2.5151 - classification_loss: 0.6323 488/500 [============================>.] - ETA: 4s - loss: 3.1462 - regression_loss: 2.5145 - classification_loss: 0.6317 489/500 [============================>.] - ETA: 3s - loss: 3.1450 - regression_loss: 2.5139 - classification_loss: 0.6312 490/500 [============================>.] - ETA: 3s - loss: 3.1441 - regression_loss: 2.5134 - classification_loss: 0.6307 491/500 [============================>.] - ETA: 3s - loss: 3.1442 - regression_loss: 2.5135 - classification_loss: 0.6307 492/500 [============================>.] - ETA: 2s - loss: 3.1429 - regression_loss: 2.5127 - classification_loss: 0.6302 493/500 [============================>.] - ETA: 2s - loss: 3.1409 - regression_loss: 2.5112 - classification_loss: 0.6297 494/500 [============================>.] - ETA: 2s - loss: 3.1395 - regression_loss: 2.5104 - classification_loss: 0.6291 495/500 [============================>.] 
500/500 [==============================] - 171s 342ms/step - loss: 3.1353 - regression_loss: 2.5083 - classification_loss: 0.6270
1172 instances of class plum with average precision: 0.1812
mAP: 0.1812
Epoch 00001: saving model to ./training/snapshots/resnet101_pascal_01.h5
Epoch 2/150
[Epoch 2/150, steps 1-42: per-batch progress-bar updates omitted; running loss fluctuated between roughly 2.74 and 3.20 (regression_loss ~2.31-2.61, classification_loss ~0.42-0.58).]
- ETA: 2:34 - loss: 2.7448 - regression_loss: 2.3137 - classification_loss: 0.4311 43/500 [=>............................] - ETA: 2:33 - loss: 2.7420 - regression_loss: 2.3113 - classification_loss: 0.4307 44/500 [=>............................] - ETA: 2:33 - loss: 2.7359 - regression_loss: 2.3057 - classification_loss: 0.4302 45/500 [=>............................] - ETA: 2:33 - loss: 2.7508 - regression_loss: 2.3177 - classification_loss: 0.4331 46/500 [=>............................] - ETA: 2:32 - loss: 2.7467 - regression_loss: 2.3136 - classification_loss: 0.4332 47/500 [=>............................] - ETA: 2:32 - loss: 2.7391 - regression_loss: 2.3084 - classification_loss: 0.4307 48/500 [=>............................] - ETA: 2:32 - loss: 2.7384 - regression_loss: 2.3076 - classification_loss: 0.4309 49/500 [=>............................] - ETA: 2:31 - loss: 2.7420 - regression_loss: 2.3062 - classification_loss: 0.4358 50/500 [==>...........................] - ETA: 2:31 - loss: 2.7353 - regression_loss: 2.3007 - classification_loss: 0.4345 51/500 [==>...........................] - ETA: 2:31 - loss: 2.7384 - regression_loss: 2.3026 - classification_loss: 0.4358 52/500 [==>...........................] - ETA: 2:30 - loss: 2.7356 - regression_loss: 2.3007 - classification_loss: 0.4349 53/500 [==>...........................] - ETA: 2:30 - loss: 2.7247 - regression_loss: 2.2924 - classification_loss: 0.4323 54/500 [==>...........................] - ETA: 2:30 - loss: 2.7207 - regression_loss: 2.2903 - classification_loss: 0.4304 55/500 [==>...........................] - ETA: 2:29 - loss: 2.7288 - regression_loss: 2.2966 - classification_loss: 0.4323 56/500 [==>...........................] - ETA: 2:29 - loss: 2.7236 - regression_loss: 2.2933 - classification_loss: 0.4303 57/500 [==>...........................] - ETA: 2:29 - loss: 2.7356 - regression_loss: 2.3040 - classification_loss: 0.4316 58/500 [==>...........................] 
- ETA: 2:29 - loss: 2.7422 - regression_loss: 2.3095 - classification_loss: 0.4327 59/500 [==>...........................] - ETA: 2:28 - loss: 2.7400 - regression_loss: 2.3080 - classification_loss: 0.4320 60/500 [==>...........................] - ETA: 2:28 - loss: 2.7355 - regression_loss: 2.3046 - classification_loss: 0.4308 61/500 [==>...........................] - ETA: 2:27 - loss: 2.7334 - regression_loss: 2.3025 - classification_loss: 0.4309 62/500 [==>...........................] - ETA: 2:27 - loss: 2.7293 - regression_loss: 2.2977 - classification_loss: 0.4316 63/500 [==>...........................] - ETA: 2:27 - loss: 2.7274 - regression_loss: 2.2967 - classification_loss: 0.4307 64/500 [==>...........................] - ETA: 2:26 - loss: 2.7272 - regression_loss: 2.2968 - classification_loss: 0.4304 65/500 [==>...........................] - ETA: 2:26 - loss: 2.7272 - regression_loss: 2.2967 - classification_loss: 0.4305 66/500 [==>...........................] - ETA: 2:26 - loss: 2.7165 - regression_loss: 2.2880 - classification_loss: 0.4285 67/500 [===>..........................] - ETA: 2:25 - loss: 2.7200 - regression_loss: 2.2906 - classification_loss: 0.4295 68/500 [===>..........................] - ETA: 2:25 - loss: 2.7012 - regression_loss: 2.2747 - classification_loss: 0.4265 69/500 [===>..........................] - ETA: 2:25 - loss: 2.7039 - regression_loss: 2.2770 - classification_loss: 0.4269 70/500 [===>..........................] - ETA: 2:24 - loss: 2.6960 - regression_loss: 2.2705 - classification_loss: 0.4255 71/500 [===>..........................] - ETA: 2:24 - loss: 2.6939 - regression_loss: 2.2692 - classification_loss: 0.4247 72/500 [===>..........................] - ETA: 2:23 - loss: 2.6915 - regression_loss: 2.2680 - classification_loss: 0.4235 73/500 [===>..........................] - ETA: 2:23 - loss: 2.6858 - regression_loss: 2.2628 - classification_loss: 0.4230 74/500 [===>..........................] 
- ETA: 2:23 - loss: 2.6932 - regression_loss: 2.2695 - classification_loss: 0.4237 75/500 [===>..........................] - ETA: 2:23 - loss: 2.6897 - regression_loss: 2.2670 - classification_loss: 0.4226 76/500 [===>..........................] - ETA: 2:22 - loss: 2.6855 - regression_loss: 2.2657 - classification_loss: 0.4198 77/500 [===>..........................] - ETA: 2:22 - loss: 2.6866 - regression_loss: 2.2642 - classification_loss: 0.4224 78/500 [===>..........................] - ETA: 2:22 - loss: 2.6798 - regression_loss: 2.2593 - classification_loss: 0.4205 79/500 [===>..........................] - ETA: 2:21 - loss: 2.6830 - regression_loss: 2.2625 - classification_loss: 0.4205 80/500 [===>..........................] - ETA: 2:21 - loss: 2.6762 - regression_loss: 2.2562 - classification_loss: 0.4200 81/500 [===>..........................] - ETA: 2:20 - loss: 2.6656 - regression_loss: 2.2488 - classification_loss: 0.4169 82/500 [===>..........................] - ETA: 2:20 - loss: 2.6684 - regression_loss: 2.2498 - classification_loss: 0.4186 83/500 [===>..........................] - ETA: 2:20 - loss: 2.6633 - regression_loss: 2.2453 - classification_loss: 0.4180 84/500 [====>.........................] - ETA: 2:19 - loss: 2.6623 - regression_loss: 2.2448 - classification_loss: 0.4175 85/500 [====>.........................] - ETA: 2:19 - loss: 2.6631 - regression_loss: 2.2441 - classification_loss: 0.4190 86/500 [====>.........................] - ETA: 2:19 - loss: 2.6656 - regression_loss: 2.2465 - classification_loss: 0.4191 87/500 [====>.........................] - ETA: 2:18 - loss: 2.6622 - regression_loss: 2.2436 - classification_loss: 0.4186 88/500 [====>.........................] - ETA: 2:18 - loss: 2.6642 - regression_loss: 2.2461 - classification_loss: 0.4181 89/500 [====>.........................] - ETA: 2:18 - loss: 2.6644 - regression_loss: 2.2458 - classification_loss: 0.4187 90/500 [====>.........................] 
- ETA: 2:18 - loss: 2.6638 - regression_loss: 2.2451 - classification_loss: 0.4186 91/500 [====>.........................] - ETA: 2:17 - loss: 2.6646 - regression_loss: 2.2463 - classification_loss: 0.4183 92/500 [====>.........................] - ETA: 2:17 - loss: 2.6635 - regression_loss: 2.2456 - classification_loss: 0.4179 93/500 [====>.........................] - ETA: 2:16 - loss: 2.6646 - regression_loss: 2.2462 - classification_loss: 0.4184 94/500 [====>.........................] - ETA: 2:16 - loss: 2.6571 - regression_loss: 2.2397 - classification_loss: 0.4174 95/500 [====>.........................] - ETA: 2:16 - loss: 2.6543 - regression_loss: 2.2373 - classification_loss: 0.4170 96/500 [====>.........................] - ETA: 2:16 - loss: 2.6524 - regression_loss: 2.2346 - classification_loss: 0.4178 97/500 [====>.........................] - ETA: 2:15 - loss: 2.6512 - regression_loss: 2.2335 - classification_loss: 0.4177 98/500 [====>.........................] - ETA: 2:15 - loss: 2.6483 - regression_loss: 2.2316 - classification_loss: 0.4166 99/500 [====>.........................] - ETA: 2:14 - loss: 2.6476 - regression_loss: 2.2308 - classification_loss: 0.4167 100/500 [=====>........................] - ETA: 2:14 - loss: 2.6456 - regression_loss: 2.2295 - classification_loss: 0.4161 101/500 [=====>........................] - ETA: 2:14 - loss: 2.6443 - regression_loss: 2.2294 - classification_loss: 0.4149 102/500 [=====>........................] - ETA: 2:13 - loss: 2.6464 - regression_loss: 2.2301 - classification_loss: 0.4163 103/500 [=====>........................] - ETA: 2:13 - loss: 2.6496 - regression_loss: 2.2331 - classification_loss: 0.4165 104/500 [=====>........................] - ETA: 2:13 - loss: 2.6468 - regression_loss: 2.2306 - classification_loss: 0.4162 105/500 [=====>........................] - ETA: 2:12 - loss: 2.6466 - regression_loss: 2.2302 - classification_loss: 0.4163 106/500 [=====>........................] 
- ETA: 2:12 - loss: 2.6477 - regression_loss: 2.2314 - classification_loss: 0.4163 107/500 [=====>........................] - ETA: 2:12 - loss: 2.6464 - regression_loss: 2.2305 - classification_loss: 0.4159 108/500 [=====>........................] - ETA: 2:11 - loss: 2.6387 - regression_loss: 2.2237 - classification_loss: 0.4151 109/500 [=====>........................] - ETA: 2:11 - loss: 2.6300 - regression_loss: 2.2166 - classification_loss: 0.4134 110/500 [=====>........................] - ETA: 2:11 - loss: 2.6290 - regression_loss: 2.2155 - classification_loss: 0.4135 111/500 [=====>........................] - ETA: 2:10 - loss: 2.6234 - regression_loss: 2.2112 - classification_loss: 0.4122 112/500 [=====>........................] - ETA: 2:10 - loss: 2.6199 - regression_loss: 2.2095 - classification_loss: 0.4104 113/500 [=====>........................] - ETA: 2:10 - loss: 2.6219 - regression_loss: 2.2117 - classification_loss: 0.4102 114/500 [=====>........................] - ETA: 2:09 - loss: 2.6188 - regression_loss: 2.2092 - classification_loss: 0.4097 115/500 [=====>........................] - ETA: 2:09 - loss: 2.6140 - regression_loss: 2.2048 - classification_loss: 0.4092 116/500 [=====>........................] - ETA: 2:09 - loss: 2.6116 - regression_loss: 2.2036 - classification_loss: 0.4080 117/500 [======>.......................] - ETA: 2:09 - loss: 2.6102 - regression_loss: 2.2024 - classification_loss: 0.4078 118/500 [======>.......................] - ETA: 2:08 - loss: 2.6089 - regression_loss: 2.2014 - classification_loss: 0.4075 119/500 [======>.......................] - ETA: 2:08 - loss: 2.6039 - regression_loss: 2.1975 - classification_loss: 0.4064 120/500 [======>.......................] - ETA: 2:07 - loss: 2.6058 - regression_loss: 2.1989 - classification_loss: 0.4069 121/500 [======>.......................] - ETA: 2:07 - loss: 2.6071 - regression_loss: 2.1997 - classification_loss: 0.4074 122/500 [======>.......................] 
- ETA: 2:07 - loss: 2.6067 - regression_loss: 2.1998 - classification_loss: 0.4070 123/500 [======>.......................] - ETA: 2:06 - loss: 2.6053 - regression_loss: 2.1980 - classification_loss: 0.4073 124/500 [======>.......................] - ETA: 2:06 - loss: 2.6090 - regression_loss: 2.2002 - classification_loss: 0.4089 125/500 [======>.......................] - ETA: 2:06 - loss: 2.6083 - regression_loss: 2.1994 - classification_loss: 0.4089 126/500 [======>.......................] - ETA: 2:05 - loss: 2.6113 - regression_loss: 2.2017 - classification_loss: 0.4096 127/500 [======>.......................] - ETA: 2:05 - loss: 2.6114 - regression_loss: 2.2017 - classification_loss: 0.4097 128/500 [======>.......................] - ETA: 2:05 - loss: 2.6101 - regression_loss: 2.2009 - classification_loss: 0.4092 129/500 [======>.......................] - ETA: 2:04 - loss: 2.6082 - regression_loss: 2.1995 - classification_loss: 0.4087 130/500 [======>.......................] - ETA: 2:04 - loss: 2.6092 - regression_loss: 2.2002 - classification_loss: 0.4090 131/500 [======>.......................] - ETA: 2:04 - loss: 2.6105 - regression_loss: 2.2015 - classification_loss: 0.4090 132/500 [======>.......................] - ETA: 2:03 - loss: 2.6108 - regression_loss: 2.2014 - classification_loss: 0.4094 133/500 [======>.......................] - ETA: 2:03 - loss: 2.6137 - regression_loss: 2.2032 - classification_loss: 0.4105 134/500 [=======>......................] - ETA: 2:03 - loss: 2.6154 - regression_loss: 2.2042 - classification_loss: 0.4112 135/500 [=======>......................] - ETA: 2:02 - loss: 2.6149 - regression_loss: 2.2038 - classification_loss: 0.4111 136/500 [=======>......................] - ETA: 2:02 - loss: 2.6136 - regression_loss: 2.2017 - classification_loss: 0.4119 137/500 [=======>......................] - ETA: 2:02 - loss: 2.6138 - regression_loss: 2.2014 - classification_loss: 0.4124 138/500 [=======>......................] 
- ETA: 2:01 - loss: 2.6125 - regression_loss: 2.2005 - classification_loss: 0.4119 139/500 [=======>......................] - ETA: 2:01 - loss: 2.6125 - regression_loss: 2.2008 - classification_loss: 0.4118 140/500 [=======>......................] - ETA: 2:01 - loss: 2.6070 - regression_loss: 2.1956 - classification_loss: 0.4114 141/500 [=======>......................] - ETA: 2:00 - loss: 2.6063 - regression_loss: 2.1950 - classification_loss: 0.4112 142/500 [=======>......................] - ETA: 2:00 - loss: 2.6071 - regression_loss: 2.1958 - classification_loss: 0.4113 143/500 [=======>......................] - ETA: 2:00 - loss: 2.6054 - regression_loss: 2.1950 - classification_loss: 0.4104 144/500 [=======>......................] - ETA: 2:00 - loss: 2.6039 - regression_loss: 2.1937 - classification_loss: 0.4102 145/500 [=======>......................] - ETA: 1:59 - loss: 2.6038 - regression_loss: 2.1934 - classification_loss: 0.4104 146/500 [=======>......................] - ETA: 1:59 - loss: 2.6037 - regression_loss: 2.1934 - classification_loss: 0.4103 147/500 [=======>......................] - ETA: 1:59 - loss: 2.6103 - regression_loss: 2.1985 - classification_loss: 0.4118 148/500 [=======>......................] - ETA: 1:58 - loss: 2.6100 - regression_loss: 2.1982 - classification_loss: 0.4118 149/500 [=======>......................] - ETA: 1:58 - loss: 2.6107 - regression_loss: 2.1987 - classification_loss: 0.4120 150/500 [========>.....................] - ETA: 1:58 - loss: 2.6106 - regression_loss: 2.1984 - classification_loss: 0.4122 151/500 [========>.....................] - ETA: 1:57 - loss: 2.6139 - regression_loss: 2.1998 - classification_loss: 0.4141 152/500 [========>.....................] - ETA: 1:57 - loss: 2.6138 - regression_loss: 2.1998 - classification_loss: 0.4140 153/500 [========>.....................] - ETA: 1:57 - loss: 2.6131 - regression_loss: 2.1984 - classification_loss: 0.4147 154/500 [========>.....................] 
- ETA: 1:56 - loss: 2.6069 - regression_loss: 2.1921 - classification_loss: 0.4149 155/500 [========>.....................] - ETA: 1:56 - loss: 2.6023 - regression_loss: 2.1878 - classification_loss: 0.4146 156/500 [========>.....................] - ETA: 1:56 - loss: 2.6069 - regression_loss: 2.1915 - classification_loss: 0.4154 157/500 [========>.....................] - ETA: 1:55 - loss: 2.6070 - regression_loss: 2.1920 - classification_loss: 0.4150 158/500 [========>.....................] - ETA: 1:55 - loss: 2.6080 - regression_loss: 2.1931 - classification_loss: 0.4149 159/500 [========>.....................] - ETA: 1:55 - loss: 2.6100 - regression_loss: 2.1937 - classification_loss: 0.4163 160/500 [========>.....................] - ETA: 1:54 - loss: 2.6060 - regression_loss: 2.1909 - classification_loss: 0.4152 161/500 [========>.....................] - ETA: 1:54 - loss: 2.6106 - regression_loss: 2.1954 - classification_loss: 0.4152 162/500 [========>.....................] - ETA: 1:54 - loss: 2.6112 - regression_loss: 2.1969 - classification_loss: 0.4142 163/500 [========>.....................] - ETA: 1:53 - loss: 2.6101 - regression_loss: 2.1956 - classification_loss: 0.4145 164/500 [========>.....................] - ETA: 1:53 - loss: 2.6093 - regression_loss: 2.1948 - classification_loss: 0.4145 165/500 [========>.....................] - ETA: 1:53 - loss: 2.6081 - regression_loss: 2.1940 - classification_loss: 0.4141 166/500 [========>.....................] - ETA: 1:52 - loss: 2.6078 - regression_loss: 2.1932 - classification_loss: 0.4146 167/500 [=========>....................] - ETA: 1:52 - loss: 2.6060 - regression_loss: 2.1919 - classification_loss: 0.4141 168/500 [=========>....................] - ETA: 1:52 - loss: 2.6065 - regression_loss: 2.1917 - classification_loss: 0.4149 169/500 [=========>....................] - ETA: 1:51 - loss: 2.6093 - regression_loss: 2.1936 - classification_loss: 0.4157 170/500 [=========>....................] 
- ETA: 1:51 - loss: 2.6084 - regression_loss: 2.1930 - classification_loss: 0.4155 171/500 [=========>....................] - ETA: 1:51 - loss: 2.6086 - regression_loss: 2.1933 - classification_loss: 0.4153 172/500 [=========>....................] - ETA: 1:50 - loss: 2.6096 - regression_loss: 2.1942 - classification_loss: 0.4154 173/500 [=========>....................] - ETA: 1:50 - loss: 2.6076 - regression_loss: 2.1927 - classification_loss: 0.4149 174/500 [=========>....................] - ETA: 1:49 - loss: 2.6072 - regression_loss: 2.1926 - classification_loss: 0.4146 175/500 [=========>....................] - ETA: 1:49 - loss: 2.6084 - regression_loss: 2.1937 - classification_loss: 0.4147 176/500 [=========>....................] - ETA: 1:49 - loss: 2.6097 - regression_loss: 2.1941 - classification_loss: 0.4156 177/500 [=========>....................] - ETA: 1:48 - loss: 2.6092 - regression_loss: 2.1937 - classification_loss: 0.4155 178/500 [=========>....................] - ETA: 1:48 - loss: 2.6079 - regression_loss: 2.1928 - classification_loss: 0.4151 179/500 [=========>....................] - ETA: 1:48 - loss: 2.6101 - regression_loss: 2.1947 - classification_loss: 0.4155 180/500 [=========>....................] - ETA: 1:47 - loss: 2.6089 - regression_loss: 2.1940 - classification_loss: 0.4149 181/500 [=========>....................] - ETA: 1:47 - loss: 2.6087 - regression_loss: 2.1936 - classification_loss: 0.4151 182/500 [=========>....................] - ETA: 1:47 - loss: 2.6078 - regression_loss: 2.1931 - classification_loss: 0.4147 183/500 [=========>....................] - ETA: 1:46 - loss: 2.6064 - regression_loss: 2.1922 - classification_loss: 0.4142 184/500 [==========>...................] - ETA: 1:46 - loss: 2.6098 - regression_loss: 2.1952 - classification_loss: 0.4146 185/500 [==========>...................] - ETA: 1:46 - loss: 2.6090 - regression_loss: 2.1945 - classification_loss: 0.4145 186/500 [==========>...................] 
- ETA: 1:45 - loss: 2.6134 - regression_loss: 2.1979 - classification_loss: 0.4155 187/500 [==========>...................] - ETA: 1:45 - loss: 2.6128 - regression_loss: 2.1975 - classification_loss: 0.4153 188/500 [==========>...................] - ETA: 1:45 - loss: 2.6164 - regression_loss: 2.2002 - classification_loss: 0.4162 189/500 [==========>...................] - ETA: 1:44 - loss: 2.6147 - regression_loss: 2.1990 - classification_loss: 0.4157 190/500 [==========>...................] - ETA: 1:44 - loss: 2.6133 - regression_loss: 2.1972 - classification_loss: 0.4161 191/500 [==========>...................] - ETA: 1:44 - loss: 2.6121 - regression_loss: 2.1962 - classification_loss: 0.4159 192/500 [==========>...................] - ETA: 1:43 - loss: 2.6111 - regression_loss: 2.1952 - classification_loss: 0.4159 193/500 [==========>...................] - ETA: 1:43 - loss: 2.6094 - regression_loss: 2.1937 - classification_loss: 0.4157 194/500 [==========>...................] - ETA: 1:43 - loss: 2.6116 - regression_loss: 2.1948 - classification_loss: 0.4168 195/500 [==========>...................] - ETA: 1:42 - loss: 2.6114 - regression_loss: 2.1948 - classification_loss: 0.4167 196/500 [==========>...................] - ETA: 1:42 - loss: 2.6135 - regression_loss: 2.1964 - classification_loss: 0.4171 197/500 [==========>...................] - ETA: 1:42 - loss: 2.6121 - regression_loss: 2.1954 - classification_loss: 0.4167 198/500 [==========>...................] - ETA: 1:41 - loss: 2.6127 - regression_loss: 2.1963 - classification_loss: 0.4164 199/500 [==========>...................] - ETA: 1:41 - loss: 2.6144 - regression_loss: 2.1979 - classification_loss: 0.4165 200/500 [===========>..................] - ETA: 1:41 - loss: 2.6121 - regression_loss: 2.1964 - classification_loss: 0.4156 201/500 [===========>..................] - ETA: 1:40 - loss: 2.6102 - regression_loss: 2.1948 - classification_loss: 0.4153 202/500 [===========>..................] 
- ETA: 1:40 - loss: 2.6093 - regression_loss: 2.1941 - classification_loss: 0.4152 203/500 [===========>..................] - ETA: 1:40 - loss: 2.6106 - regression_loss: 2.1950 - classification_loss: 0.4156 204/500 [===========>..................] - ETA: 1:39 - loss: 2.6108 - regression_loss: 2.1952 - classification_loss: 0.4156 205/500 [===========>..................] - ETA: 1:39 - loss: 2.6094 - regression_loss: 2.1940 - classification_loss: 0.4154 206/500 [===========>..................] - ETA: 1:39 - loss: 2.6069 - regression_loss: 2.1919 - classification_loss: 0.4150 207/500 [===========>..................] - ETA: 1:38 - loss: 2.6065 - regression_loss: 2.1918 - classification_loss: 0.4147 208/500 [===========>..................] - ETA: 1:38 - loss: 2.6064 - regression_loss: 2.1919 - classification_loss: 0.4145 209/500 [===========>..................] - ETA: 1:38 - loss: 2.6050 - regression_loss: 2.1908 - classification_loss: 0.4142 210/500 [===========>..................] - ETA: 1:37 - loss: 2.6048 - regression_loss: 2.1913 - classification_loss: 0.4135 211/500 [===========>..................] - ETA: 1:37 - loss: 2.6018 - regression_loss: 2.1889 - classification_loss: 0.4129 212/500 [===========>..................] - ETA: 1:37 - loss: 2.6021 - regression_loss: 2.1890 - classification_loss: 0.4131 213/500 [===========>..................] - ETA: 1:36 - loss: 2.6014 - regression_loss: 2.1886 - classification_loss: 0.4128 214/500 [===========>..................] - ETA: 1:36 - loss: 2.6020 - regression_loss: 2.1892 - classification_loss: 0.4128 215/500 [===========>..................] - ETA: 1:36 - loss: 2.5997 - regression_loss: 2.1873 - classification_loss: 0.4125 216/500 [===========>..................] - ETA: 1:35 - loss: 2.5996 - regression_loss: 2.1869 - classification_loss: 0.4127 217/500 [============>.................] - ETA: 1:35 - loss: 2.5974 - regression_loss: 2.1850 - classification_loss: 0.4124 218/500 [============>.................] 
- ETA: 1:35 - loss: 2.5958 - regression_loss: 2.1835 - classification_loss: 0.4123 219/500 [============>.................] - ETA: 1:34 - loss: 2.5948 - regression_loss: 2.1827 - classification_loss: 0.4121 220/500 [============>.................] - ETA: 1:34 - loss: 2.5947 - regression_loss: 2.1827 - classification_loss: 0.4120 221/500 [============>.................] - ETA: 1:34 - loss: 2.5935 - regression_loss: 2.1815 - classification_loss: 0.4120 222/500 [============>.................] - ETA: 1:33 - loss: 2.5937 - regression_loss: 2.1812 - classification_loss: 0.4125 223/500 [============>.................] - ETA: 1:33 - loss: 2.5940 - regression_loss: 2.1813 - classification_loss: 0.4127 224/500 [============>.................] - ETA: 1:33 - loss: 2.5938 - regression_loss: 2.1811 - classification_loss: 0.4127 225/500 [============>.................] - ETA: 1:32 - loss: 2.5944 - regression_loss: 2.1813 - classification_loss: 0.4130 226/500 [============>.................] - ETA: 1:32 - loss: 2.5926 - regression_loss: 2.1801 - classification_loss: 0.4125 227/500 [============>.................] - ETA: 1:32 - loss: 2.5924 - regression_loss: 2.1802 - classification_loss: 0.4123 228/500 [============>.................] - ETA: 1:31 - loss: 2.5929 - regression_loss: 2.1806 - classification_loss: 0.4123 229/500 [============>.................] - ETA: 1:31 - loss: 2.5925 - regression_loss: 2.1803 - classification_loss: 0.4122 230/500 [============>.................] - ETA: 1:31 - loss: 2.5912 - regression_loss: 2.1786 - classification_loss: 0.4127 231/500 [============>.................] - ETA: 1:30 - loss: 2.5918 - regression_loss: 2.1791 - classification_loss: 0.4127 232/500 [============>.................] - ETA: 1:30 - loss: 2.5920 - regression_loss: 2.1793 - classification_loss: 0.4127 233/500 [============>.................] - ETA: 1:30 - loss: 2.5900 - regression_loss: 2.1778 - classification_loss: 0.4122 234/500 [=============>................] 
- ETA: 1:29 - loss: 2.5886 - regression_loss: 2.1767 - classification_loss: 0.4119 235/500 [=============>................] - ETA: 1:29 - loss: 2.5881 - regression_loss: 2.1763 - classification_loss: 0.4117 236/500 [=============>................] - ETA: 1:29 - loss: 2.5889 - regression_loss: 2.1770 - classification_loss: 0.4119 237/500 [=============>................] - ETA: 1:28 - loss: 2.5865 - regression_loss: 2.1738 - classification_loss: 0.4127 238/500 [=============>................] - ETA: 1:28 - loss: 2.5868 - regression_loss: 2.1741 - classification_loss: 0.4127 239/500 [=============>................] - ETA: 1:28 - loss: 2.5872 - regression_loss: 2.1745 - classification_loss: 0.4127 240/500 [=============>................] - ETA: 1:27 - loss: 2.5831 - regression_loss: 2.1712 - classification_loss: 0.4119 241/500 [=============>................] - ETA: 1:27 - loss: 2.5826 - regression_loss: 2.1709 - classification_loss: 0.4117 242/500 [=============>................] - ETA: 1:27 - loss: 2.5821 - regression_loss: 2.1706 - classification_loss: 0.4115 243/500 [=============>................] - ETA: 1:26 - loss: 2.5843 - regression_loss: 2.1725 - classification_loss: 0.4119 244/500 [=============>................] - ETA: 1:26 - loss: 2.5820 - regression_loss: 2.1707 - classification_loss: 0.4113 245/500 [=============>................] - ETA: 1:26 - loss: 2.5805 - regression_loss: 2.1695 - classification_loss: 0.4109 246/500 [=============>................] - ETA: 1:25 - loss: 2.5787 - regression_loss: 2.1674 - classification_loss: 0.4112 247/500 [=============>................] - ETA: 1:25 - loss: 2.5774 - regression_loss: 2.1666 - classification_loss: 0.4107 248/500 [=============>................] - ETA: 1:24 - loss: 2.5768 - regression_loss: 2.1661 - classification_loss: 0.4106 249/500 [=============>................] - ETA: 1:24 - loss: 2.5742 - regression_loss: 2.1641 - classification_loss: 0.4101 250/500 [==============>...............] 
- ETA: 1:24 - loss: 2.5731 - regression_loss: 2.1632 - classification_loss: 0.4099
[per-batch progress for steps 251-499 elided; loss declined steadily from 2.57 to 2.48]
500/500 [==============================] - 169s 338ms/step - loss: 2.4837 - regression_loss: 2.0888 - classification_loss: 0.3949
1172 instances of class plum with average precision: 0.3697
mAP: 0.3697
Epoch 00002: saving model to ./training/snapshots/resnet101_pascal_02.h5
Epoch 3/150
1/500 [..............................] - ETA: 2:47 - loss: 1.6654 - regression_loss: 1.4435 - classification_loss: 0.2218
2/500 [..............................] - ETA: 2:49 - loss: 2.1326 - regression_loss: 1.6885 - classification_loss: 0.4441
3/500 [..............................] - ETA: 2:49 - loss: 2.2426 - regression_loss: 1.8319 - classification_loss: 0.4107
4/500 [..............................] - ETA: 2:49 - loss: 2.3090 - regression_loss: 1.9120 - classification_loss: 0.3971
5/500 [..............................]
- ETA: 2:49 - loss: 2.3025 - regression_loss: 1.9157 - classification_loss: 0.3868
[per-batch progress for steps 6-84 elided; loss hovered around 2.30]
84/500 [====>.........................] - ETA: 2:21 - loss: 2.3107 - regression_loss: 1.9391 - classification_loss: 0.3715
85/500 [====>.........................]
- ETA: 2:20 - loss: 2.3108 - regression_loss: 1.9397 - classification_loss: 0.3711 86/500 [====>.........................] - ETA: 2:20 - loss: 2.3085 - regression_loss: 1.9384 - classification_loss: 0.3701 87/500 [====>.........................] - ETA: 2:20 - loss: 2.3171 - regression_loss: 1.9464 - classification_loss: 0.3706 88/500 [====>.........................] - ETA: 2:19 - loss: 2.3125 - regression_loss: 1.9430 - classification_loss: 0.3695 89/500 [====>.........................] - ETA: 2:19 - loss: 2.3117 - regression_loss: 1.9426 - classification_loss: 0.3691 90/500 [====>.........................] - ETA: 2:19 - loss: 2.3174 - regression_loss: 1.9481 - classification_loss: 0.3693 91/500 [====>.........................] - ETA: 2:18 - loss: 2.3133 - regression_loss: 1.9448 - classification_loss: 0.3684 92/500 [====>.........................] - ETA: 2:18 - loss: 2.3163 - regression_loss: 1.9488 - classification_loss: 0.3675 93/500 [====>.........................] - ETA: 2:18 - loss: 2.3074 - regression_loss: 1.9412 - classification_loss: 0.3662 94/500 [====>.........................] - ETA: 2:17 - loss: 2.3101 - regression_loss: 1.9435 - classification_loss: 0.3666 95/500 [====>.........................] - ETA: 2:17 - loss: 2.3048 - regression_loss: 1.9385 - classification_loss: 0.3664 96/500 [====>.........................] - ETA: 2:16 - loss: 2.2983 - regression_loss: 1.9329 - classification_loss: 0.3653 97/500 [====>.........................] - ETA: 2:16 - loss: 2.3052 - regression_loss: 1.9393 - classification_loss: 0.3659 98/500 [====>.........................] - ETA: 2:16 - loss: 2.2978 - regression_loss: 1.9340 - classification_loss: 0.3638 99/500 [====>.........................] - ETA: 2:16 - loss: 2.2963 - regression_loss: 1.9329 - classification_loss: 0.3635 100/500 [=====>........................] - ETA: 2:15 - loss: 2.2944 - regression_loss: 1.9313 - classification_loss: 0.3631 101/500 [=====>........................] 
- ETA: 2:15 - loss: 2.2934 - regression_loss: 1.9301 - classification_loss: 0.3633 102/500 [=====>........................] - ETA: 2:14 - loss: 2.2928 - regression_loss: 1.9304 - classification_loss: 0.3624 103/500 [=====>........................] - ETA: 2:14 - loss: 2.2897 - regression_loss: 1.9272 - classification_loss: 0.3625 104/500 [=====>........................] - ETA: 2:14 - loss: 2.2897 - regression_loss: 1.9277 - classification_loss: 0.3620 105/500 [=====>........................] - ETA: 2:13 - loss: 2.2838 - regression_loss: 1.9229 - classification_loss: 0.3609 106/500 [=====>........................] - ETA: 2:13 - loss: 2.2758 - regression_loss: 1.9162 - classification_loss: 0.3596 107/500 [=====>........................] - ETA: 2:13 - loss: 2.2800 - regression_loss: 1.9207 - classification_loss: 0.3593 108/500 [=====>........................] - ETA: 2:12 - loss: 2.2768 - regression_loss: 1.9184 - classification_loss: 0.3585 109/500 [=====>........................] - ETA: 2:12 - loss: 2.2836 - regression_loss: 1.9245 - classification_loss: 0.3591 110/500 [=====>........................] - ETA: 2:12 - loss: 2.2810 - regression_loss: 1.9225 - classification_loss: 0.3586 111/500 [=====>........................] - ETA: 2:11 - loss: 2.2759 - regression_loss: 1.9186 - classification_loss: 0.3573 112/500 [=====>........................] - ETA: 2:11 - loss: 2.2783 - regression_loss: 1.9204 - classification_loss: 0.3579 113/500 [=====>........................] - ETA: 2:11 - loss: 2.2701 - regression_loss: 1.9132 - classification_loss: 0.3568 114/500 [=====>........................] - ETA: 2:10 - loss: 2.2682 - regression_loss: 1.9113 - classification_loss: 0.3569 115/500 [=====>........................] - ETA: 2:10 - loss: 2.2671 - regression_loss: 1.9109 - classification_loss: 0.3562 116/500 [=====>........................] - ETA: 2:10 - loss: 2.2675 - regression_loss: 1.9100 - classification_loss: 0.3575 117/500 [======>.......................] 
- ETA: 2:09 - loss: 2.2666 - regression_loss: 1.9093 - classification_loss: 0.3573 118/500 [======>.......................] - ETA: 2:09 - loss: 2.2643 - regression_loss: 1.9075 - classification_loss: 0.3568 119/500 [======>.......................] - ETA: 2:09 - loss: 2.2652 - regression_loss: 1.9084 - classification_loss: 0.3568 120/500 [======>.......................] - ETA: 2:08 - loss: 2.2603 - regression_loss: 1.9044 - classification_loss: 0.3559 121/500 [======>.......................] - ETA: 2:08 - loss: 2.2662 - regression_loss: 1.9090 - classification_loss: 0.3572 122/500 [======>.......................] - ETA: 2:08 - loss: 2.2635 - regression_loss: 1.9068 - classification_loss: 0.3568 123/500 [======>.......................] - ETA: 2:07 - loss: 2.2629 - regression_loss: 1.9064 - classification_loss: 0.3565 124/500 [======>.......................] - ETA: 2:07 - loss: 2.2621 - regression_loss: 1.9057 - classification_loss: 0.3565 125/500 [======>.......................] - ETA: 2:07 - loss: 2.2671 - regression_loss: 1.9075 - classification_loss: 0.3596 126/500 [======>.......................] - ETA: 2:06 - loss: 2.2670 - regression_loss: 1.9075 - classification_loss: 0.3595 127/500 [======>.......................] - ETA: 2:06 - loss: 2.2637 - regression_loss: 1.9050 - classification_loss: 0.3586 128/500 [======>.......................] - ETA: 2:06 - loss: 2.2642 - regression_loss: 1.9058 - classification_loss: 0.3584 129/500 [======>.......................] - ETA: 2:05 - loss: 2.2631 - regression_loss: 1.9050 - classification_loss: 0.3581 130/500 [======>.......................] - ETA: 2:05 - loss: 2.2600 - regression_loss: 1.9025 - classification_loss: 0.3575 131/500 [======>.......................] - ETA: 2:05 - loss: 2.2593 - regression_loss: 1.9020 - classification_loss: 0.3574 132/500 [======>.......................] - ETA: 2:04 - loss: 2.2562 - regression_loss: 1.8995 - classification_loss: 0.3567 133/500 [======>.......................] 
- ETA: 2:04 - loss: 2.2577 - regression_loss: 1.9010 - classification_loss: 0.3567 134/500 [=======>......................] - ETA: 2:04 - loss: 2.2552 - regression_loss: 1.8988 - classification_loss: 0.3564 135/500 [=======>......................] - ETA: 2:03 - loss: 2.2534 - regression_loss: 1.8978 - classification_loss: 0.3557 136/500 [=======>......................] - ETA: 2:03 - loss: 2.2547 - regression_loss: 1.8986 - classification_loss: 0.3561 137/500 [=======>......................] - ETA: 2:03 - loss: 2.2549 - regression_loss: 1.8991 - classification_loss: 0.3558 138/500 [=======>......................] - ETA: 2:02 - loss: 2.2545 - regression_loss: 1.8988 - classification_loss: 0.3556 139/500 [=======>......................] - ETA: 2:02 - loss: 2.2522 - regression_loss: 1.8970 - classification_loss: 0.3552 140/500 [=======>......................] - ETA: 2:02 - loss: 2.2499 - regression_loss: 1.8951 - classification_loss: 0.3547 141/500 [=======>......................] - ETA: 2:01 - loss: 2.2509 - regression_loss: 1.8960 - classification_loss: 0.3549 142/500 [=======>......................] - ETA: 2:01 - loss: 2.2542 - regression_loss: 1.8981 - classification_loss: 0.3561 143/500 [=======>......................] - ETA: 2:01 - loss: 2.2536 - regression_loss: 1.8978 - classification_loss: 0.3558 144/500 [=======>......................] - ETA: 2:00 - loss: 2.2575 - regression_loss: 1.9002 - classification_loss: 0.3573 145/500 [=======>......................] - ETA: 2:00 - loss: 2.2597 - regression_loss: 1.9020 - classification_loss: 0.3577 146/500 [=======>......................] - ETA: 2:00 - loss: 2.2595 - regression_loss: 1.9021 - classification_loss: 0.3574 147/500 [=======>......................] - ETA: 1:59 - loss: 2.2603 - regression_loss: 1.9025 - classification_loss: 0.3578 148/500 [=======>......................] - ETA: 1:59 - loss: 2.2594 - regression_loss: 1.9012 - classification_loss: 0.3582 149/500 [=======>......................] 
- ETA: 1:59 - loss: 2.2599 - regression_loss: 1.9014 - classification_loss: 0.3585 150/500 [========>.....................] - ETA: 1:58 - loss: 2.2546 - regression_loss: 1.8972 - classification_loss: 0.3575 151/500 [========>.....................] - ETA: 1:58 - loss: 2.2547 - regression_loss: 1.8968 - classification_loss: 0.3578 152/500 [========>.....................] - ETA: 1:58 - loss: 2.2553 - regression_loss: 1.8980 - classification_loss: 0.3573 153/500 [========>.....................] - ETA: 1:57 - loss: 2.2541 - regression_loss: 1.8972 - classification_loss: 0.3569 154/500 [========>.....................] - ETA: 1:57 - loss: 2.2549 - regression_loss: 1.8976 - classification_loss: 0.3573 155/500 [========>.....................] - ETA: 1:57 - loss: 2.2559 - regression_loss: 1.8988 - classification_loss: 0.3571 156/500 [========>.....................] - ETA: 1:56 - loss: 2.2587 - regression_loss: 1.9008 - classification_loss: 0.3579 157/500 [========>.....................] - ETA: 1:56 - loss: 2.2523 - regression_loss: 1.8943 - classification_loss: 0.3579 158/500 [========>.....................] - ETA: 1:56 - loss: 2.2511 - regression_loss: 1.8935 - classification_loss: 0.3576 159/500 [========>.....................] - ETA: 1:55 - loss: 2.2519 - regression_loss: 1.8935 - classification_loss: 0.3584 160/500 [========>.....................] - ETA: 1:55 - loss: 2.2515 - regression_loss: 1.8933 - classification_loss: 0.3582 161/500 [========>.....................] - ETA: 1:55 - loss: 2.2518 - regression_loss: 1.8934 - classification_loss: 0.3584 162/500 [========>.....................] - ETA: 1:54 - loss: 2.2527 - regression_loss: 1.8942 - classification_loss: 0.3585 163/500 [========>.....................] - ETA: 1:54 - loss: 2.2556 - regression_loss: 1.8966 - classification_loss: 0.3590 164/500 [========>.....................] - ETA: 1:54 - loss: 2.2565 - regression_loss: 1.8969 - classification_loss: 0.3596 165/500 [========>.....................] 
- ETA: 1:53 - loss: 2.2607 - regression_loss: 1.9009 - classification_loss: 0.3598 166/500 [========>.....................] - ETA: 1:53 - loss: 2.2590 - regression_loss: 1.8996 - classification_loss: 0.3595 167/500 [=========>....................] - ETA: 1:53 - loss: 2.2589 - regression_loss: 1.8998 - classification_loss: 0.3591 168/500 [=========>....................] - ETA: 1:52 - loss: 2.2584 - regression_loss: 1.8994 - classification_loss: 0.3589 169/500 [=========>....................] - ETA: 1:52 - loss: 2.2601 - regression_loss: 1.9008 - classification_loss: 0.3593 170/500 [=========>....................] - ETA: 1:52 - loss: 2.2581 - regression_loss: 1.8992 - classification_loss: 0.3590 171/500 [=========>....................] - ETA: 1:51 - loss: 2.2613 - regression_loss: 1.9022 - classification_loss: 0.3591 172/500 [=========>....................] - ETA: 1:51 - loss: 2.2626 - regression_loss: 1.9033 - classification_loss: 0.3593 173/500 [=========>....................] - ETA: 1:51 - loss: 2.2620 - regression_loss: 1.9029 - classification_loss: 0.3590 174/500 [=========>....................] - ETA: 1:50 - loss: 2.2619 - regression_loss: 1.9030 - classification_loss: 0.3589 175/500 [=========>....................] - ETA: 1:50 - loss: 2.2649 - regression_loss: 1.9017 - classification_loss: 0.3632 176/500 [=========>....................] - ETA: 1:50 - loss: 2.2659 - regression_loss: 1.9029 - classification_loss: 0.3631 177/500 [=========>....................] - ETA: 1:49 - loss: 2.2646 - regression_loss: 1.9015 - classification_loss: 0.3631 178/500 [=========>....................] - ETA: 1:49 - loss: 2.2696 - regression_loss: 1.9061 - classification_loss: 0.3635 179/500 [=========>....................] - ETA: 1:49 - loss: 2.2696 - regression_loss: 1.9062 - classification_loss: 0.3634 180/500 [=========>....................] - ETA: 1:48 - loss: 2.2692 - regression_loss: 1.9063 - classification_loss: 0.3629 181/500 [=========>....................] 
- ETA: 1:48 - loss: 2.2708 - regression_loss: 1.9078 - classification_loss: 0.3629 182/500 [=========>....................] - ETA: 1:48 - loss: 2.2677 - regression_loss: 1.9058 - classification_loss: 0.3619 183/500 [=========>....................] - ETA: 1:47 - loss: 2.2700 - regression_loss: 1.9075 - classification_loss: 0.3625 184/500 [==========>...................] - ETA: 1:47 - loss: 2.2671 - regression_loss: 1.9052 - classification_loss: 0.3618 185/500 [==========>...................] - ETA: 1:47 - loss: 2.2665 - regression_loss: 1.9048 - classification_loss: 0.3617 186/500 [==========>...................] - ETA: 1:46 - loss: 2.2694 - regression_loss: 1.9075 - classification_loss: 0.3619 187/500 [==========>...................] - ETA: 1:46 - loss: 2.2696 - regression_loss: 1.9080 - classification_loss: 0.3616 188/500 [==========>...................] - ETA: 1:46 - loss: 2.2685 - regression_loss: 1.9074 - classification_loss: 0.3611 189/500 [==========>...................] - ETA: 1:45 - loss: 2.2671 - regression_loss: 1.9059 - classification_loss: 0.3612 190/500 [==========>...................] - ETA: 1:45 - loss: 2.2668 - regression_loss: 1.9059 - classification_loss: 0.3609 191/500 [==========>...................] - ETA: 1:45 - loss: 2.2665 - regression_loss: 1.9059 - classification_loss: 0.3606 192/500 [==========>...................] - ETA: 1:44 - loss: 2.2699 - regression_loss: 1.9084 - classification_loss: 0.3615 193/500 [==========>...................] - ETA: 1:44 - loss: 2.2696 - regression_loss: 1.9085 - classification_loss: 0.3611 194/500 [==========>...................] - ETA: 1:44 - loss: 2.2701 - regression_loss: 1.9091 - classification_loss: 0.3610 195/500 [==========>...................] - ETA: 1:43 - loss: 2.2695 - regression_loss: 1.9088 - classification_loss: 0.3607 196/500 [==========>...................] - ETA: 1:43 - loss: 2.2682 - regression_loss: 1.9072 - classification_loss: 0.3610 197/500 [==========>...................] 
- ETA: 1:43 - loss: 2.2684 - regression_loss: 1.9074 - classification_loss: 0.3610 198/500 [==========>...................] - ETA: 1:42 - loss: 2.2695 - regression_loss: 1.9085 - classification_loss: 0.3610 199/500 [==========>...................] - ETA: 1:42 - loss: 2.2688 - regression_loss: 1.9077 - classification_loss: 0.3611 200/500 [===========>..................] - ETA: 1:42 - loss: 2.2699 - regression_loss: 1.9081 - classification_loss: 0.3618 201/500 [===========>..................] - ETA: 1:41 - loss: 2.2704 - regression_loss: 1.9086 - classification_loss: 0.3618 202/500 [===========>..................] - ETA: 1:41 - loss: 2.2710 - regression_loss: 1.9092 - classification_loss: 0.3618 203/500 [===========>..................] - ETA: 1:41 - loss: 2.2707 - regression_loss: 1.9091 - classification_loss: 0.3616 204/500 [===========>..................] - ETA: 1:40 - loss: 2.2717 - regression_loss: 1.9097 - classification_loss: 0.3621 205/500 [===========>..................] - ETA: 1:40 - loss: 2.2683 - regression_loss: 1.9067 - classification_loss: 0.3616 206/500 [===========>..................] - ETA: 1:40 - loss: 2.2677 - regression_loss: 1.9065 - classification_loss: 0.3613 207/500 [===========>..................] - ETA: 1:39 - loss: 2.2681 - regression_loss: 1.9068 - classification_loss: 0.3613 208/500 [===========>..................] - ETA: 1:39 - loss: 2.2690 - regression_loss: 1.9076 - classification_loss: 0.3615 209/500 [===========>..................] - ETA: 1:39 - loss: 2.2690 - regression_loss: 1.9073 - classification_loss: 0.3617 210/500 [===========>..................] - ETA: 1:38 - loss: 2.2670 - regression_loss: 1.9059 - classification_loss: 0.3612 211/500 [===========>..................] - ETA: 1:38 - loss: 2.2635 - regression_loss: 1.9030 - classification_loss: 0.3605 212/500 [===========>..................] - ETA: 1:38 - loss: 2.2645 - regression_loss: 1.9038 - classification_loss: 0.3607 213/500 [===========>..................] 
- ETA: 1:37 - loss: 2.2669 - regression_loss: 1.9060 - classification_loss: 0.3609 214/500 [===========>..................] - ETA: 1:37 - loss: 2.2669 - regression_loss: 1.9063 - classification_loss: 0.3606 215/500 [===========>..................] - ETA: 1:37 - loss: 2.2674 - regression_loss: 1.9068 - classification_loss: 0.3606 216/500 [===========>..................] - ETA: 1:36 - loss: 2.2682 - regression_loss: 1.9076 - classification_loss: 0.3606 217/500 [============>.................] - ETA: 1:36 - loss: 2.2672 - regression_loss: 1.9066 - classification_loss: 0.3606 218/500 [============>.................] - ETA: 1:36 - loss: 2.2660 - regression_loss: 1.9055 - classification_loss: 0.3605 219/500 [============>.................] - ETA: 1:35 - loss: 2.2659 - regression_loss: 1.9057 - classification_loss: 0.3602 220/500 [============>.................] - ETA: 1:35 - loss: 2.2659 - regression_loss: 1.9060 - classification_loss: 0.3599 221/500 [============>.................] - ETA: 1:35 - loss: 2.2657 - regression_loss: 1.9064 - classification_loss: 0.3593 222/500 [============>.................] - ETA: 1:34 - loss: 2.2664 - regression_loss: 1.9071 - classification_loss: 0.3592 223/500 [============>.................] - ETA: 1:34 - loss: 2.2693 - regression_loss: 1.9097 - classification_loss: 0.3596 224/500 [============>.................] - ETA: 1:34 - loss: 2.2667 - regression_loss: 1.9076 - classification_loss: 0.3591 225/500 [============>.................] - ETA: 1:33 - loss: 2.2676 - regression_loss: 1.9087 - classification_loss: 0.3589 226/500 [============>.................] - ETA: 1:33 - loss: 2.2667 - regression_loss: 1.9080 - classification_loss: 0.3587 227/500 [============>.................] - ETA: 1:33 - loss: 2.2672 - regression_loss: 1.9084 - classification_loss: 0.3588 228/500 [============>.................] - ETA: 1:32 - loss: 2.2668 - regression_loss: 1.9084 - classification_loss: 0.3584 229/500 [============>.................] 
- ETA: 1:32 - loss: 2.2676 - regression_loss: 1.9091 - classification_loss: 0.3585 230/500 [============>.................] - ETA: 1:32 - loss: 2.2664 - regression_loss: 1.9082 - classification_loss: 0.3582 231/500 [============>.................] - ETA: 1:31 - loss: 2.2695 - regression_loss: 1.9099 - classification_loss: 0.3595 232/500 [============>.................] - ETA: 1:31 - loss: 2.2695 - regression_loss: 1.9095 - classification_loss: 0.3600 233/500 [============>.................] - ETA: 1:31 - loss: 2.2689 - regression_loss: 1.9089 - classification_loss: 0.3600 234/500 [=============>................] - ETA: 1:30 - loss: 2.2636 - regression_loss: 1.9043 - classification_loss: 0.3592 235/500 [=============>................] - ETA: 1:30 - loss: 2.2655 - regression_loss: 1.9060 - classification_loss: 0.3596 236/500 [=============>................] - ETA: 1:29 - loss: 2.2647 - regression_loss: 1.9053 - classification_loss: 0.3594 237/500 [=============>................] - ETA: 1:29 - loss: 2.2652 - regression_loss: 1.9058 - classification_loss: 0.3594 238/500 [=============>................] - ETA: 1:29 - loss: 2.2667 - regression_loss: 1.9069 - classification_loss: 0.3599 239/500 [=============>................] - ETA: 1:28 - loss: 2.2682 - regression_loss: 1.9082 - classification_loss: 0.3600 240/500 [=============>................] - ETA: 1:28 - loss: 2.2705 - regression_loss: 1.9100 - classification_loss: 0.3604 241/500 [=============>................] - ETA: 1:28 - loss: 2.2669 - regression_loss: 1.9071 - classification_loss: 0.3598 242/500 [=============>................] - ETA: 1:27 - loss: 2.2671 - regression_loss: 1.9072 - classification_loss: 0.3598 243/500 [=============>................] - ETA: 1:27 - loss: 2.2687 - regression_loss: 1.9085 - classification_loss: 0.3602 244/500 [=============>................] - ETA: 1:27 - loss: 2.2711 - regression_loss: 1.9108 - classification_loss: 0.3603 245/500 [=============>................] 
- ETA: 1:26 - loss: 2.2665 - regression_loss: 1.9069 - classification_loss: 0.3596 246/500 [=============>................] - ETA: 1:26 - loss: 2.2670 - regression_loss: 1.9073 - classification_loss: 0.3596 247/500 [=============>................] - ETA: 1:26 - loss: 2.2653 - regression_loss: 1.9060 - classification_loss: 0.3593 248/500 [=============>................] - ETA: 1:25 - loss: 2.2635 - regression_loss: 1.9046 - classification_loss: 0.3590 249/500 [=============>................] - ETA: 1:25 - loss: 2.2631 - regression_loss: 1.9043 - classification_loss: 0.3588 250/500 [==============>...............] - ETA: 1:25 - loss: 2.2673 - regression_loss: 1.9076 - classification_loss: 0.3597 251/500 [==============>...............] - ETA: 1:24 - loss: 2.2665 - regression_loss: 1.9069 - classification_loss: 0.3596 252/500 [==============>...............] - ETA: 1:24 - loss: 2.2654 - regression_loss: 1.9061 - classification_loss: 0.3593 253/500 [==============>...............] - ETA: 1:24 - loss: 2.2651 - regression_loss: 1.9060 - classification_loss: 0.3591 254/500 [==============>...............] - ETA: 1:23 - loss: 2.2640 - regression_loss: 1.9052 - classification_loss: 0.3588 255/500 [==============>...............] - ETA: 1:23 - loss: 2.2626 - regression_loss: 1.9042 - classification_loss: 0.3584 256/500 [==============>...............] - ETA: 1:23 - loss: 2.2633 - regression_loss: 1.9051 - classification_loss: 0.3583 257/500 [==============>...............] - ETA: 1:22 - loss: 2.2634 - regression_loss: 1.9052 - classification_loss: 0.3582 258/500 [==============>...............] - ETA: 1:22 - loss: 2.2639 - regression_loss: 1.9057 - classification_loss: 0.3583 259/500 [==============>...............] - ETA: 1:22 - loss: 2.2631 - regression_loss: 1.9048 - classification_loss: 0.3583 260/500 [==============>...............] - ETA: 1:21 - loss: 2.2631 - regression_loss: 1.9050 - classification_loss: 0.3581 261/500 [==============>...............] 
- ETA: 1:21 - loss: 2.2625 - regression_loss: 1.9047 - classification_loss: 0.3578 262/500 [==============>...............] - ETA: 1:21 - loss: 2.2616 - regression_loss: 1.9037 - classification_loss: 0.3580 263/500 [==============>...............] - ETA: 1:20 - loss: 2.2624 - regression_loss: 1.9043 - classification_loss: 0.3581 264/500 [==============>...............] - ETA: 1:20 - loss: 2.2621 - regression_loss: 1.9041 - classification_loss: 0.3581 265/500 [==============>...............] - ETA: 1:20 - loss: 2.2640 - regression_loss: 1.9057 - classification_loss: 0.3583 266/500 [==============>...............] - ETA: 1:19 - loss: 2.2636 - regression_loss: 1.9055 - classification_loss: 0.3582 267/500 [===============>..............] - ETA: 1:19 - loss: 2.2644 - regression_loss: 1.9063 - classification_loss: 0.3581 268/500 [===============>..............] - ETA: 1:19 - loss: 2.2627 - regression_loss: 1.9051 - classification_loss: 0.3576 269/500 [===============>..............] - ETA: 1:18 - loss: 2.2626 - regression_loss: 1.9049 - classification_loss: 0.3577 270/500 [===============>..............] - ETA: 1:18 - loss: 2.2620 - regression_loss: 1.9045 - classification_loss: 0.3575 271/500 [===============>..............] - ETA: 1:18 - loss: 2.2620 - regression_loss: 1.9042 - classification_loss: 0.3577 272/500 [===============>..............] - ETA: 1:17 - loss: 2.2618 - regression_loss: 1.9040 - classification_loss: 0.3579 273/500 [===============>..............] - ETA: 1:17 - loss: 2.2614 - regression_loss: 1.9036 - classification_loss: 0.3578 274/500 [===============>..............] - ETA: 1:17 - loss: 2.2620 - regression_loss: 1.9042 - classification_loss: 0.3578 275/500 [===============>..............] - ETA: 1:16 - loss: 2.2616 - regression_loss: 1.9039 - classification_loss: 0.3577 276/500 [===============>..............] - ETA: 1:16 - loss: 2.2613 - regression_loss: 1.9038 - classification_loss: 0.3575 277/500 [===============>..............] 
- ETA: 1:16 - loss: 2.2609 - regression_loss: 1.9034 - classification_loss: 0.3575 278/500 [===============>..............] - ETA: 1:15 - loss: 2.2638 - regression_loss: 1.9034 - classification_loss: 0.3604 279/500 [===============>..............] - ETA: 1:15 - loss: 2.2635 - regression_loss: 1.9033 - classification_loss: 0.3602 280/500 [===============>..............] - ETA: 1:15 - loss: 2.2627 - regression_loss: 1.9027 - classification_loss: 0.3599 281/500 [===============>..............] - ETA: 1:14 - loss: 2.2613 - regression_loss: 1.9016 - classification_loss: 0.3597 282/500 [===============>..............] - ETA: 1:14 - loss: 2.2641 - regression_loss: 1.9035 - classification_loss: 0.3607 283/500 [===============>..............] - ETA: 1:14 - loss: 2.2637 - regression_loss: 1.9031 - classification_loss: 0.3606 284/500 [================>.............] - ETA: 1:13 - loss: 2.2634 - regression_loss: 1.9030 - classification_loss: 0.3604 285/500 [================>.............] - ETA: 1:13 - loss: 2.2627 - regression_loss: 1.9026 - classification_loss: 0.3601 286/500 [================>.............] - ETA: 1:13 - loss: 2.2622 - regression_loss: 1.9023 - classification_loss: 0.3599 287/500 [================>.............] - ETA: 1:12 - loss: 2.2625 - regression_loss: 1.9027 - classification_loss: 0.3598 288/500 [================>.............] - ETA: 1:12 - loss: 2.2620 - regression_loss: 1.9022 - classification_loss: 0.3598 289/500 [================>.............] - ETA: 1:11 - loss: 2.2626 - regression_loss: 1.9028 - classification_loss: 0.3598 290/500 [================>.............] - ETA: 1:11 - loss: 2.2622 - regression_loss: 1.9026 - classification_loss: 0.3596 291/500 [================>.............] - ETA: 1:11 - loss: 2.2612 - regression_loss: 1.9022 - classification_loss: 0.3590 292/500 [================>.............] - ETA: 1:10 - loss: 2.2611 - regression_loss: 1.9022 - classification_loss: 0.3589 293/500 [================>.............] 
[... per-step progress output for epoch 3, steps 294-499, elided ...]
500/500 [==============================] - 170s 341ms/step - loss: 2.2150 - regression_loss: 1.8632 - classification_loss: 0.3518
1172 instances of class plum with average precision: 0.4293
mAP: 0.4293
Epoch 00003: saving model to ./training/snapshots/resnet101_pascal_03.h5
Epoch 4/150
[... per-step progress output for epoch 4, steps 1-126, elided ...]
- ETA: 2:06 - loss: 2.1218 - regression_loss: 1.8009 - classification_loss: 0.3208 127/500 [======>.......................] - ETA: 2:06 - loss: 2.1216 - regression_loss: 1.8008 - classification_loss: 0.3208 128/500 [======>.......................] - ETA: 2:06 - loss: 2.1162 - regression_loss: 1.7961 - classification_loss: 0.3201 129/500 [======>.......................] - ETA: 2:05 - loss: 2.1129 - regression_loss: 1.7936 - classification_loss: 0.3193 130/500 [======>.......................] - ETA: 2:05 - loss: 2.1174 - regression_loss: 1.7978 - classification_loss: 0.3196 131/500 [======>.......................] - ETA: 2:05 - loss: 2.1156 - regression_loss: 1.7963 - classification_loss: 0.3193 132/500 [======>.......................] - ETA: 2:04 - loss: 2.1165 - regression_loss: 1.7974 - classification_loss: 0.3192 133/500 [======>.......................] - ETA: 2:04 - loss: 2.1190 - regression_loss: 1.8000 - classification_loss: 0.3190 134/500 [=======>......................] - ETA: 2:04 - loss: 2.1207 - regression_loss: 1.8022 - classification_loss: 0.3186 135/500 [=======>......................] - ETA: 2:03 - loss: 2.1225 - regression_loss: 1.8035 - classification_loss: 0.3190 136/500 [=======>......................] - ETA: 2:03 - loss: 2.1159 - regression_loss: 1.7982 - classification_loss: 0.3178 137/500 [=======>......................] - ETA: 2:02 - loss: 2.1168 - regression_loss: 1.7989 - classification_loss: 0.3179 138/500 [=======>......................] - ETA: 2:02 - loss: 2.1176 - regression_loss: 1.7990 - classification_loss: 0.3186 139/500 [=======>......................] - ETA: 2:02 - loss: 2.1186 - regression_loss: 1.7999 - classification_loss: 0.3187 140/500 [=======>......................] - ETA: 2:01 - loss: 2.1203 - regression_loss: 1.8020 - classification_loss: 0.3183 141/500 [=======>......................] - ETA: 2:01 - loss: 2.1205 - regression_loss: 1.8027 - classification_loss: 0.3178 142/500 [=======>......................] 
- ETA: 2:01 - loss: 2.1192 - regression_loss: 1.8015 - classification_loss: 0.3177 143/500 [=======>......................] - ETA: 2:00 - loss: 2.1201 - regression_loss: 1.8027 - classification_loss: 0.3173 144/500 [=======>......................] - ETA: 2:00 - loss: 2.1221 - regression_loss: 1.8046 - classification_loss: 0.3175 145/500 [=======>......................] - ETA: 2:00 - loss: 2.1245 - regression_loss: 1.8068 - classification_loss: 0.3176 146/500 [=======>......................] - ETA: 1:59 - loss: 2.1173 - regression_loss: 1.8003 - classification_loss: 0.3170 147/500 [=======>......................] - ETA: 1:59 - loss: 2.1165 - regression_loss: 1.7999 - classification_loss: 0.3166 148/500 [=======>......................] - ETA: 1:59 - loss: 2.1197 - regression_loss: 1.8026 - classification_loss: 0.3171 149/500 [=======>......................] - ETA: 1:58 - loss: 2.1202 - regression_loss: 1.8035 - classification_loss: 0.3167 150/500 [========>.....................] - ETA: 1:58 - loss: 2.1201 - regression_loss: 1.8037 - classification_loss: 0.3165 151/500 [========>.....................] - ETA: 1:58 - loss: 2.1177 - regression_loss: 1.8017 - classification_loss: 0.3160 152/500 [========>.....................] - ETA: 1:58 - loss: 2.1181 - regression_loss: 1.8019 - classification_loss: 0.3162 153/500 [========>.....................] - ETA: 1:57 - loss: 2.1194 - regression_loss: 1.8033 - classification_loss: 0.3162 154/500 [========>.....................] - ETA: 1:57 - loss: 2.1166 - regression_loss: 1.8011 - classification_loss: 0.3155 155/500 [========>.....................] - ETA: 1:57 - loss: 2.1155 - regression_loss: 1.8005 - classification_loss: 0.3150 156/500 [========>.....................] - ETA: 1:56 - loss: 2.1183 - regression_loss: 1.8022 - classification_loss: 0.3162 157/500 [========>.....................] - ETA: 1:56 - loss: 2.1168 - regression_loss: 1.8011 - classification_loss: 0.3158 158/500 [========>.....................] 
- ETA: 1:56 - loss: 2.1183 - regression_loss: 1.8021 - classification_loss: 0.3162 159/500 [========>.....................] - ETA: 1:55 - loss: 2.1231 - regression_loss: 1.8055 - classification_loss: 0.3176 160/500 [========>.....................] - ETA: 1:55 - loss: 2.1245 - regression_loss: 1.8066 - classification_loss: 0.3179 161/500 [========>.....................] - ETA: 1:55 - loss: 2.1244 - regression_loss: 1.8064 - classification_loss: 0.3179 162/500 [========>.....................] - ETA: 1:54 - loss: 2.1255 - regression_loss: 1.8070 - classification_loss: 0.3184 163/500 [========>.....................] - ETA: 1:54 - loss: 2.1244 - regression_loss: 1.8061 - classification_loss: 0.3183 164/500 [========>.....................] - ETA: 1:54 - loss: 2.1207 - regression_loss: 1.8031 - classification_loss: 0.3175 165/500 [========>.....................] - ETA: 1:53 - loss: 2.1189 - regression_loss: 1.8022 - classification_loss: 0.3167 166/500 [========>.....................] - ETA: 1:53 - loss: 2.1192 - regression_loss: 1.8022 - classification_loss: 0.3170 167/500 [=========>....................] - ETA: 1:53 - loss: 2.1197 - regression_loss: 1.8029 - classification_loss: 0.3168 168/500 [=========>....................] - ETA: 1:52 - loss: 2.1236 - regression_loss: 1.8063 - classification_loss: 0.3173 169/500 [=========>....................] - ETA: 1:52 - loss: 2.1187 - regression_loss: 1.8023 - classification_loss: 0.3164 170/500 [=========>....................] - ETA: 1:51 - loss: 2.1211 - regression_loss: 1.8044 - classification_loss: 0.3166 171/500 [=========>....................] - ETA: 1:51 - loss: 2.1186 - regression_loss: 1.8021 - classification_loss: 0.3165 172/500 [=========>....................] - ETA: 1:51 - loss: 2.1182 - regression_loss: 1.8019 - classification_loss: 0.3163 173/500 [=========>....................] - ETA: 1:51 - loss: 2.1176 - regression_loss: 1.8016 - classification_loss: 0.3160 174/500 [=========>....................] 
- ETA: 1:50 - loss: 2.1175 - regression_loss: 1.8017 - classification_loss: 0.3158 175/500 [=========>....................] - ETA: 1:50 - loss: 2.1172 - regression_loss: 1.8015 - classification_loss: 0.3157 176/500 [=========>....................] - ETA: 1:50 - loss: 2.1155 - regression_loss: 1.7996 - classification_loss: 0.3159 177/500 [=========>....................] - ETA: 1:49 - loss: 2.1181 - regression_loss: 1.8013 - classification_loss: 0.3168 178/500 [=========>....................] - ETA: 1:49 - loss: 2.1209 - regression_loss: 1.8033 - classification_loss: 0.3175 179/500 [=========>....................] - ETA: 1:49 - loss: 2.1191 - regression_loss: 1.8019 - classification_loss: 0.3172 180/500 [=========>....................] - ETA: 1:48 - loss: 2.1185 - regression_loss: 1.8015 - classification_loss: 0.3170 181/500 [=========>....................] - ETA: 1:48 - loss: 2.1196 - regression_loss: 1.8022 - classification_loss: 0.3174 182/500 [=========>....................] - ETA: 1:48 - loss: 2.1184 - regression_loss: 1.8013 - classification_loss: 0.3171 183/500 [=========>....................] - ETA: 1:47 - loss: 2.1171 - regression_loss: 1.8000 - classification_loss: 0.3171 184/500 [==========>...................] - ETA: 1:47 - loss: 2.1141 - regression_loss: 1.7972 - classification_loss: 0.3169 185/500 [==========>...................] - ETA: 1:46 - loss: 2.1145 - regression_loss: 1.7976 - classification_loss: 0.3169 186/500 [==========>...................] - ETA: 1:46 - loss: 2.1159 - regression_loss: 1.7989 - classification_loss: 0.3170 187/500 [==========>...................] - ETA: 1:46 - loss: 2.1169 - regression_loss: 1.7999 - classification_loss: 0.3170 188/500 [==========>...................] - ETA: 1:45 - loss: 2.1148 - regression_loss: 1.7985 - classification_loss: 0.3163 189/500 [==========>...................] - ETA: 1:45 - loss: 2.1179 - regression_loss: 1.8007 - classification_loss: 0.3172 190/500 [==========>...................] 
- ETA: 1:45 - loss: 2.1177 - regression_loss: 1.8006 - classification_loss: 0.3171 191/500 [==========>...................] - ETA: 1:44 - loss: 2.1185 - regression_loss: 1.8014 - classification_loss: 0.3171 192/500 [==========>...................] - ETA: 1:44 - loss: 2.1158 - regression_loss: 1.7989 - classification_loss: 0.3169 193/500 [==========>...................] - ETA: 1:44 - loss: 2.1143 - regression_loss: 1.7976 - classification_loss: 0.3167 194/500 [==========>...................] - ETA: 1:43 - loss: 2.1172 - regression_loss: 1.8003 - classification_loss: 0.3170 195/500 [==========>...................] - ETA: 1:43 - loss: 2.1120 - regression_loss: 1.7955 - classification_loss: 0.3165 196/500 [==========>...................] - ETA: 1:43 - loss: 2.1128 - regression_loss: 1.7962 - classification_loss: 0.3166 197/500 [==========>...................] - ETA: 1:42 - loss: 2.1125 - regression_loss: 1.7962 - classification_loss: 0.3163 198/500 [==========>...................] - ETA: 1:42 - loss: 2.1102 - regression_loss: 1.7946 - classification_loss: 0.3157 199/500 [==========>...................] - ETA: 1:42 - loss: 2.1116 - regression_loss: 1.7956 - classification_loss: 0.3160 200/500 [===========>..................] - ETA: 1:41 - loss: 2.1096 - regression_loss: 1.7936 - classification_loss: 0.3160 201/500 [===========>..................] - ETA: 1:41 - loss: 2.1104 - regression_loss: 1.7944 - classification_loss: 0.3160 202/500 [===========>..................] - ETA: 1:41 - loss: 2.1095 - regression_loss: 1.7937 - classification_loss: 0.3158 203/500 [===========>..................] - ETA: 1:40 - loss: 2.1060 - regression_loss: 1.7908 - classification_loss: 0.3152 204/500 [===========>..................] - ETA: 1:40 - loss: 2.1075 - regression_loss: 1.7922 - classification_loss: 0.3153 205/500 [===========>..................] - ETA: 1:40 - loss: 2.1054 - regression_loss: 1.7903 - classification_loss: 0.3150 206/500 [===========>..................] 
- ETA: 1:39 - loss: 2.1137 - regression_loss: 1.7968 - classification_loss: 0.3169 207/500 [===========>..................] - ETA: 1:39 - loss: 2.1130 - regression_loss: 1.7963 - classification_loss: 0.3166 208/500 [===========>..................] - ETA: 1:39 - loss: 2.1141 - regression_loss: 1.7972 - classification_loss: 0.3169 209/500 [===========>..................] - ETA: 1:38 - loss: 2.1096 - regression_loss: 1.7935 - classification_loss: 0.3160 210/500 [===========>..................] - ETA: 1:38 - loss: 2.1099 - regression_loss: 1.7937 - classification_loss: 0.3162 211/500 [===========>..................] - ETA: 1:38 - loss: 2.1097 - regression_loss: 1.7937 - classification_loss: 0.3159 212/500 [===========>..................] - ETA: 1:37 - loss: 2.1096 - regression_loss: 1.7938 - classification_loss: 0.3159 213/500 [===========>..................] - ETA: 1:37 - loss: 2.1107 - regression_loss: 1.7948 - classification_loss: 0.3159 214/500 [===========>..................] - ETA: 1:37 - loss: 2.1098 - regression_loss: 1.7942 - classification_loss: 0.3156 215/500 [===========>..................] - ETA: 1:36 - loss: 2.1053 - regression_loss: 1.7904 - classification_loss: 0.3149 216/500 [===========>..................] - ETA: 1:36 - loss: 2.1054 - regression_loss: 1.7908 - classification_loss: 0.3146 217/500 [============>.................] - ETA: 1:36 - loss: 2.1028 - regression_loss: 1.7883 - classification_loss: 0.3145 218/500 [============>.................] - ETA: 1:35 - loss: 2.1022 - regression_loss: 1.7876 - classification_loss: 0.3145 219/500 [============>.................] - ETA: 1:35 - loss: 2.1022 - regression_loss: 1.7877 - classification_loss: 0.3145 220/500 [============>.................] - ETA: 1:35 - loss: 2.1015 - regression_loss: 1.7871 - classification_loss: 0.3144 221/500 [============>.................] - ETA: 1:34 - loss: 2.1038 - regression_loss: 1.7890 - classification_loss: 0.3148 222/500 [============>.................] 
- ETA: 1:34 - loss: 2.1043 - regression_loss: 1.7897 - classification_loss: 0.3146 223/500 [============>.................] - ETA: 1:34 - loss: 2.1070 - regression_loss: 1.7917 - classification_loss: 0.3153 224/500 [============>.................] - ETA: 1:33 - loss: 2.1050 - regression_loss: 1.7900 - classification_loss: 0.3150 225/500 [============>.................] - ETA: 1:33 - loss: 2.1012 - regression_loss: 1.7869 - classification_loss: 0.3143 226/500 [============>.................] - ETA: 1:33 - loss: 2.1023 - regression_loss: 1.7878 - classification_loss: 0.3144 227/500 [============>.................] - ETA: 1:32 - loss: 2.1013 - regression_loss: 1.7866 - classification_loss: 0.3146 228/500 [============>.................] - ETA: 1:32 - loss: 2.1018 - regression_loss: 1.7869 - classification_loss: 0.3149 229/500 [============>.................] - ETA: 1:32 - loss: 2.1023 - regression_loss: 1.7873 - classification_loss: 0.3150 230/500 [============>.................] - ETA: 1:31 - loss: 2.1024 - regression_loss: 1.7874 - classification_loss: 0.3149 231/500 [============>.................] - ETA: 1:31 - loss: 2.1060 - regression_loss: 1.7891 - classification_loss: 0.3169 232/500 [============>.................] - ETA: 1:31 - loss: 2.1044 - regression_loss: 1.7877 - classification_loss: 0.3167 233/500 [============>.................] - ETA: 1:30 - loss: 2.1050 - regression_loss: 1.7883 - classification_loss: 0.3168 234/500 [=============>................] - ETA: 1:30 - loss: 2.1128 - regression_loss: 1.7929 - classification_loss: 0.3200 235/500 [=============>................] - ETA: 1:29 - loss: 2.1144 - regression_loss: 1.7944 - classification_loss: 0.3200 236/500 [=============>................] - ETA: 1:29 - loss: 2.1124 - regression_loss: 1.7928 - classification_loss: 0.3196 237/500 [=============>................] - ETA: 1:29 - loss: 2.1118 - regression_loss: 1.7922 - classification_loss: 0.3196 238/500 [=============>................] 
- ETA: 1:28 - loss: 2.1113 - regression_loss: 1.7919 - classification_loss: 0.3194 239/500 [=============>................] - ETA: 1:28 - loss: 2.1122 - regression_loss: 1.7927 - classification_loss: 0.3195 240/500 [=============>................] - ETA: 1:28 - loss: 2.1163 - regression_loss: 1.7964 - classification_loss: 0.3199 241/500 [=============>................] - ETA: 1:27 - loss: 2.1181 - regression_loss: 1.7979 - classification_loss: 0.3202 242/500 [=============>................] - ETA: 1:27 - loss: 2.1173 - regression_loss: 1.7974 - classification_loss: 0.3199 243/500 [=============>................] - ETA: 1:27 - loss: 2.1187 - regression_loss: 1.7983 - classification_loss: 0.3204 244/500 [=============>................] - ETA: 1:26 - loss: 2.1180 - regression_loss: 1.7977 - classification_loss: 0.3203 245/500 [=============>................] - ETA: 1:26 - loss: 2.1162 - regression_loss: 1.7962 - classification_loss: 0.3201 246/500 [=============>................] - ETA: 1:26 - loss: 2.1153 - regression_loss: 1.7955 - classification_loss: 0.3198 247/500 [=============>................] - ETA: 1:25 - loss: 2.1159 - regression_loss: 1.7963 - classification_loss: 0.3197 248/500 [=============>................] - ETA: 1:25 - loss: 2.1173 - regression_loss: 1.7975 - classification_loss: 0.3197 249/500 [=============>................] - ETA: 1:25 - loss: 2.1179 - regression_loss: 1.7982 - classification_loss: 0.3197 250/500 [==============>...............] - ETA: 1:24 - loss: 2.1180 - regression_loss: 1.7984 - classification_loss: 0.3196 251/500 [==============>...............] - ETA: 1:24 - loss: 2.1174 - regression_loss: 1.7978 - classification_loss: 0.3196 252/500 [==============>...............] - ETA: 1:24 - loss: 2.1172 - regression_loss: 1.7975 - classification_loss: 0.3198 253/500 [==============>...............] - ETA: 1:23 - loss: 2.1149 - regression_loss: 1.7955 - classification_loss: 0.3194 254/500 [==============>...............] 
- ETA: 1:23 - loss: 2.1158 - regression_loss: 1.7964 - classification_loss: 0.3194 255/500 [==============>...............] - ETA: 1:23 - loss: 2.1164 - regression_loss: 1.7967 - classification_loss: 0.3197 256/500 [==============>...............] - ETA: 1:22 - loss: 2.1140 - regression_loss: 1.7947 - classification_loss: 0.3193 257/500 [==============>...............] - ETA: 1:22 - loss: 2.1156 - regression_loss: 1.7959 - classification_loss: 0.3196 258/500 [==============>...............] - ETA: 1:22 - loss: 2.1120 - regression_loss: 1.7930 - classification_loss: 0.3190 259/500 [==============>...............] - ETA: 1:21 - loss: 2.1136 - regression_loss: 1.7944 - classification_loss: 0.3192 260/500 [==============>...............] - ETA: 1:21 - loss: 2.1128 - regression_loss: 1.7936 - classification_loss: 0.3192 261/500 [==============>...............] - ETA: 1:21 - loss: 2.1146 - regression_loss: 1.7951 - classification_loss: 0.3195 262/500 [==============>...............] - ETA: 1:20 - loss: 2.1141 - regression_loss: 1.7948 - classification_loss: 0.3193 263/500 [==============>...............] - ETA: 1:20 - loss: 2.1136 - regression_loss: 1.7944 - classification_loss: 0.3192 264/500 [==============>...............] - ETA: 1:19 - loss: 2.1119 - regression_loss: 1.7930 - classification_loss: 0.3190 265/500 [==============>...............] - ETA: 1:19 - loss: 2.1113 - regression_loss: 1.7927 - classification_loss: 0.3186 266/500 [==============>...............] - ETA: 1:19 - loss: 2.1121 - regression_loss: 1.7934 - classification_loss: 0.3187 267/500 [===============>..............] - ETA: 1:18 - loss: 2.1129 - regression_loss: 1.7940 - classification_loss: 0.3188 268/500 [===============>..............] - ETA: 1:18 - loss: 2.1126 - regression_loss: 1.7938 - classification_loss: 0.3188 269/500 [===============>..............] - ETA: 1:18 - loss: 2.1137 - regression_loss: 1.7948 - classification_loss: 0.3190 270/500 [===============>..............] 
- ETA: 1:17 - loss: 2.1118 - regression_loss: 1.7932 - classification_loss: 0.3187 271/500 [===============>..............] - ETA: 1:17 - loss: 2.1108 - regression_loss: 1.7920 - classification_loss: 0.3188 272/500 [===============>..............] - ETA: 1:17 - loss: 2.1110 - regression_loss: 1.7921 - classification_loss: 0.3188 273/500 [===============>..............] - ETA: 1:16 - loss: 2.1108 - regression_loss: 1.7921 - classification_loss: 0.3187 274/500 [===============>..............] - ETA: 1:16 - loss: 2.1080 - regression_loss: 1.7896 - classification_loss: 0.3184 275/500 [===============>..............] - ETA: 1:16 - loss: 2.1078 - regression_loss: 1.7894 - classification_loss: 0.3184 276/500 [===============>..............] - ETA: 1:15 - loss: 2.1091 - regression_loss: 1.7908 - classification_loss: 0.3183 277/500 [===============>..............] - ETA: 1:15 - loss: 2.1076 - regression_loss: 1.7896 - classification_loss: 0.3180 278/500 [===============>..............] - ETA: 1:15 - loss: 2.1065 - regression_loss: 1.7888 - classification_loss: 0.3177 279/500 [===============>..............] - ETA: 1:14 - loss: 2.1067 - regression_loss: 1.7890 - classification_loss: 0.3177 280/500 [===============>..............] - ETA: 1:14 - loss: 2.1028 - regression_loss: 1.7855 - classification_loss: 0.3174 281/500 [===============>..............] - ETA: 1:14 - loss: 2.1026 - regression_loss: 1.7853 - classification_loss: 0.3173 282/500 [===============>..............] - ETA: 1:13 - loss: 2.1027 - regression_loss: 1.7854 - classification_loss: 0.3173 283/500 [===============>..............] - ETA: 1:13 - loss: 2.1007 - regression_loss: 1.7838 - classification_loss: 0.3169 284/500 [================>.............] - ETA: 1:13 - loss: 2.1036 - regression_loss: 1.7867 - classification_loss: 0.3169 285/500 [================>.............] - ETA: 1:12 - loss: 2.1037 - regression_loss: 1.7872 - classification_loss: 0.3165 286/500 [================>.............] 
- ETA: 1:12 - loss: 2.1025 - regression_loss: 1.7861 - classification_loss: 0.3164 287/500 [================>.............] - ETA: 1:12 - loss: 2.1008 - regression_loss: 1.7849 - classification_loss: 0.3159 288/500 [================>.............] - ETA: 1:11 - loss: 2.1031 - regression_loss: 1.7868 - classification_loss: 0.3163 289/500 [================>.............] - ETA: 1:11 - loss: 2.1033 - regression_loss: 1.7872 - classification_loss: 0.3161 290/500 [================>.............] - ETA: 1:11 - loss: 2.1023 - regression_loss: 1.7864 - classification_loss: 0.3159 291/500 [================>.............] - ETA: 1:10 - loss: 2.1035 - regression_loss: 1.7873 - classification_loss: 0.3162 292/500 [================>.............] - ETA: 1:10 - loss: 2.1017 - regression_loss: 1.7857 - classification_loss: 0.3159 293/500 [================>.............] - ETA: 1:10 - loss: 2.1015 - regression_loss: 1.7858 - classification_loss: 0.3157 294/500 [================>.............] - ETA: 1:09 - loss: 2.1011 - regression_loss: 1.7854 - classification_loss: 0.3157 295/500 [================>.............] - ETA: 1:09 - loss: 2.1019 - regression_loss: 1.7860 - classification_loss: 0.3159 296/500 [================>.............] - ETA: 1:09 - loss: 2.1030 - regression_loss: 1.7871 - classification_loss: 0.3159 297/500 [================>.............] - ETA: 1:08 - loss: 2.1055 - regression_loss: 1.7890 - classification_loss: 0.3164 298/500 [================>.............] - ETA: 1:08 - loss: 2.1043 - regression_loss: 1.7883 - classification_loss: 0.3160 299/500 [================>.............] - ETA: 1:08 - loss: 2.1028 - regression_loss: 1.7872 - classification_loss: 0.3156 300/500 [=================>............] - ETA: 1:07 - loss: 2.1013 - regression_loss: 1.7861 - classification_loss: 0.3152 301/500 [=================>............] - ETA: 1:07 - loss: 2.1030 - regression_loss: 1.7876 - classification_loss: 0.3154 302/500 [=================>............] 
- ETA: 1:07 - loss: 2.1012 - regression_loss: 1.7861 - classification_loss: 0.3152 303/500 [=================>............] - ETA: 1:06 - loss: 2.1037 - regression_loss: 1.7882 - classification_loss: 0.3155 304/500 [=================>............] - ETA: 1:06 - loss: 2.1114 - regression_loss: 1.7907 - classification_loss: 0.3208 305/500 [=================>............] - ETA: 1:06 - loss: 2.1086 - regression_loss: 1.7884 - classification_loss: 0.3202 306/500 [=================>............] - ETA: 1:05 - loss: 2.1081 - regression_loss: 1.7878 - classification_loss: 0.3203 307/500 [=================>............] - ETA: 1:05 - loss: 2.1078 - regression_loss: 1.7879 - classification_loss: 0.3199 308/500 [=================>............] - ETA: 1:05 - loss: 2.1046 - regression_loss: 1.7850 - classification_loss: 0.3196 309/500 [=================>............] - ETA: 1:04 - loss: 2.1020 - regression_loss: 1.7829 - classification_loss: 0.3192 310/500 [=================>............] - ETA: 1:04 - loss: 2.1009 - regression_loss: 1.7819 - classification_loss: 0.3190 311/500 [=================>............] - ETA: 1:04 - loss: 2.1017 - regression_loss: 1.7821 - classification_loss: 0.3195 312/500 [=================>............] - ETA: 1:03 - loss: 2.1016 - regression_loss: 1.7823 - classification_loss: 0.3193 313/500 [=================>............] - ETA: 1:03 - loss: 2.1024 - regression_loss: 1.7831 - classification_loss: 0.3193 314/500 [=================>............] - ETA: 1:03 - loss: 2.1010 - regression_loss: 1.7819 - classification_loss: 0.3191 315/500 [=================>............] - ETA: 1:02 - loss: 2.1008 - regression_loss: 1.7821 - classification_loss: 0.3188 316/500 [=================>............] - ETA: 1:02 - loss: 2.1010 - regression_loss: 1.7823 - classification_loss: 0.3187 317/500 [==================>...........] - ETA: 1:01 - loss: 2.1011 - regression_loss: 1.7824 - classification_loss: 0.3187 318/500 [==================>...........] 
- ETA: 1:01 - loss: 2.1014 - regression_loss: 1.7827 - classification_loss: 0.3187 319/500 [==================>...........] - ETA: 1:01 - loss: 2.1000 - regression_loss: 1.7816 - classification_loss: 0.3184 320/500 [==================>...........] - ETA: 1:00 - loss: 2.1002 - regression_loss: 1.7817 - classification_loss: 0.3186 321/500 [==================>...........] - ETA: 1:00 - loss: 2.1005 - regression_loss: 1.7821 - classification_loss: 0.3184 322/500 [==================>...........] - ETA: 1:00 - loss: 2.1014 - regression_loss: 1.7827 - classification_loss: 0.3186 323/500 [==================>...........] - ETA: 59s - loss: 2.1021 - regression_loss: 1.7834 - classification_loss: 0.3188  324/500 [==================>...........] - ETA: 59s - loss: 2.1025 - regression_loss: 1.7837 - classification_loss: 0.3188 325/500 [==================>...........] - ETA: 59s - loss: 2.1030 - regression_loss: 1.7842 - classification_loss: 0.3188 326/500 [==================>...........] - ETA: 58s - loss: 2.1025 - regression_loss: 1.7838 - classification_loss: 0.3187 327/500 [==================>...........] - ETA: 58s - loss: 2.1021 - regression_loss: 1.7836 - classification_loss: 0.3185 328/500 [==================>...........] - ETA: 58s - loss: 2.1020 - regression_loss: 1.7836 - classification_loss: 0.3185 329/500 [==================>...........] - ETA: 57s - loss: 2.1021 - regression_loss: 1.7832 - classification_loss: 0.3189 330/500 [==================>...........] - ETA: 57s - loss: 2.1036 - regression_loss: 1.7845 - classification_loss: 0.3191 331/500 [==================>...........] - ETA: 57s - loss: 2.1017 - regression_loss: 1.7829 - classification_loss: 0.3188 332/500 [==================>...........] - ETA: 56s - loss: 2.1016 - regression_loss: 1.7830 - classification_loss: 0.3186 333/500 [==================>...........] - ETA: 56s - loss: 2.1019 - regression_loss: 1.7832 - classification_loss: 0.3186 334/500 [===================>..........] 
[per-batch progress output for batches 335/500 through 494/500 trimmed: loss 2.1009 -> 2.0597, regression_loss 1.7824 -> 1.7479, classification_loss 0.3185 -> 0.3118]
500/500 [==============================] - 170s 339ms/step - loss: 2.0577 - regression_loss: 1.7459 - classification_loss: 0.3118
1172 instances of class plum with average precision: 0.5444
mAP: 0.5444
Epoch 00004: saving model to ./training/snapshots/resnet101_pascal_04.h5
Epoch 5/150
[per-batch progress output for batches 1/500 through 24/500 trimmed: loss 1.9452 -> 2.0999, regression_loss 1.7166 -> 1.7636, classification_loss 0.2287 -> 0.3363]
[per-batch progress output for batches 25/500 through 168/500 trimmed: loss 2.1099 -> 1.9853, regression_loss 1.7749 -> 1.6777, classification_loss 0.3350 -> 0.3076]
- ETA: 2:08 - loss: 2.0472 - regression_loss: 1.7270 - classification_loss: 0.3202 122/500 [======>.......................] - ETA: 2:07 - loss: 2.0451 - regression_loss: 1.7252 - classification_loss: 0.3199 123/500 [======>.......................] - ETA: 2:07 - loss: 2.0421 - regression_loss: 1.7229 - classification_loss: 0.3192 124/500 [======>.......................] - ETA: 2:07 - loss: 2.0450 - regression_loss: 1.7256 - classification_loss: 0.3194 125/500 [======>.......................] - ETA: 2:06 - loss: 2.0432 - regression_loss: 1.7245 - classification_loss: 0.3186 126/500 [======>.......................] - ETA: 2:06 - loss: 2.0409 - regression_loss: 1.7231 - classification_loss: 0.3178 127/500 [======>.......................] - ETA: 2:06 - loss: 2.0342 - regression_loss: 1.7175 - classification_loss: 0.3166 128/500 [======>.......................] - ETA: 2:05 - loss: 2.0308 - regression_loss: 1.7147 - classification_loss: 0.3160 129/500 [======>.......................] - ETA: 2:05 - loss: 2.0303 - regression_loss: 1.7145 - classification_loss: 0.3159 130/500 [======>.......................] - ETA: 2:05 - loss: 2.0334 - regression_loss: 1.7173 - classification_loss: 0.3161 131/500 [======>.......................] - ETA: 2:04 - loss: 2.0273 - regression_loss: 1.7122 - classification_loss: 0.3151 132/500 [======>.......................] - ETA: 2:04 - loss: 2.0242 - regression_loss: 1.7093 - classification_loss: 0.3148 133/500 [======>.......................] - ETA: 2:04 - loss: 2.0251 - regression_loss: 1.7105 - classification_loss: 0.3146 134/500 [=======>......................] - ETA: 2:03 - loss: 2.0277 - regression_loss: 1.7129 - classification_loss: 0.3148 135/500 [=======>......................] - ETA: 2:03 - loss: 2.0260 - regression_loss: 1.7119 - classification_loss: 0.3141 136/500 [=======>......................] - ETA: 2:03 - loss: 2.0284 - regression_loss: 1.7136 - classification_loss: 0.3147 137/500 [=======>......................] 
- ETA: 2:03 - loss: 2.0203 - regression_loss: 1.7069 - classification_loss: 0.3133 138/500 [=======>......................] - ETA: 2:02 - loss: 2.0211 - regression_loss: 1.7076 - classification_loss: 0.3135 139/500 [=======>......................] - ETA: 2:02 - loss: 2.0224 - regression_loss: 1.7082 - classification_loss: 0.3142 140/500 [=======>......................] - ETA: 2:02 - loss: 2.0203 - regression_loss: 1.7066 - classification_loss: 0.3137 141/500 [=======>......................] - ETA: 2:01 - loss: 2.0177 - regression_loss: 1.7047 - classification_loss: 0.3131 142/500 [=======>......................] - ETA: 2:01 - loss: 2.0176 - regression_loss: 1.7043 - classification_loss: 0.3133 143/500 [=======>......................] - ETA: 2:01 - loss: 2.0129 - regression_loss: 1.6995 - classification_loss: 0.3134 144/500 [=======>......................] - ETA: 2:00 - loss: 2.0127 - regression_loss: 1.6997 - classification_loss: 0.3130 145/500 [=======>......................] - ETA: 2:00 - loss: 2.0130 - regression_loss: 1.7003 - classification_loss: 0.3126 146/500 [=======>......................] - ETA: 2:00 - loss: 2.0150 - regression_loss: 1.7021 - classification_loss: 0.3129 147/500 [=======>......................] - ETA: 1:59 - loss: 2.0141 - regression_loss: 1.7014 - classification_loss: 0.3127 148/500 [=======>......................] - ETA: 1:59 - loss: 2.0148 - regression_loss: 1.7021 - classification_loss: 0.3126 149/500 [=======>......................] - ETA: 1:58 - loss: 2.0103 - regression_loss: 1.6987 - classification_loss: 0.3116 150/500 [========>.....................] - ETA: 1:58 - loss: 2.0086 - regression_loss: 1.6975 - classification_loss: 0.3111 151/500 [========>.....................] - ETA: 1:58 - loss: 2.0103 - regression_loss: 1.6993 - classification_loss: 0.3110 152/500 [========>.....................] - ETA: 1:57 - loss: 2.0084 - regression_loss: 1.6977 - classification_loss: 0.3106 153/500 [========>.....................] 
- ETA: 1:57 - loss: 2.0092 - regression_loss: 1.6986 - classification_loss: 0.3106 154/500 [========>.....................] - ETA: 1:57 - loss: 2.0034 - regression_loss: 1.6940 - classification_loss: 0.3093 155/500 [========>.....................] - ETA: 1:56 - loss: 2.0026 - regression_loss: 1.6930 - classification_loss: 0.3095 156/500 [========>.....................] - ETA: 1:56 - loss: 2.0023 - regression_loss: 1.6928 - classification_loss: 0.3094 157/500 [========>.....................] - ETA: 1:56 - loss: 1.9982 - regression_loss: 1.6895 - classification_loss: 0.3087 158/500 [========>.....................] - ETA: 1:55 - loss: 2.0009 - regression_loss: 1.6913 - classification_loss: 0.3096 159/500 [========>.....................] - ETA: 1:55 - loss: 1.9982 - regression_loss: 1.6890 - classification_loss: 0.3093 160/500 [========>.....................] - ETA: 1:55 - loss: 2.0027 - regression_loss: 1.6918 - classification_loss: 0.3110 161/500 [========>.....................] - ETA: 1:54 - loss: 1.9948 - regression_loss: 1.6851 - classification_loss: 0.3097 162/500 [========>.....................] - ETA: 1:54 - loss: 1.9925 - regression_loss: 1.6831 - classification_loss: 0.3093 163/500 [========>.....................] - ETA: 1:54 - loss: 1.9912 - regression_loss: 1.6821 - classification_loss: 0.3090 164/500 [========>.....................] - ETA: 1:53 - loss: 1.9898 - regression_loss: 1.6811 - classification_loss: 0.3087 165/500 [========>.....................] - ETA: 1:53 - loss: 1.9874 - regression_loss: 1.6786 - classification_loss: 0.3088 166/500 [========>.....................] - ETA: 1:53 - loss: 1.9866 - regression_loss: 1.6783 - classification_loss: 0.3083 167/500 [=========>....................] - ETA: 1:52 - loss: 1.9882 - regression_loss: 1.6799 - classification_loss: 0.3083 168/500 [=========>....................] - ETA: 1:52 - loss: 1.9853 - regression_loss: 1.6777 - classification_loss: 0.3076 169/500 [=========>....................] 
- ETA: 1:52 - loss: 1.9869 - regression_loss: 1.6793 - classification_loss: 0.3076 170/500 [=========>....................] - ETA: 1:51 - loss: 1.9874 - regression_loss: 1.6799 - classification_loss: 0.3076 171/500 [=========>....................] - ETA: 1:51 - loss: 1.9894 - regression_loss: 1.6807 - classification_loss: 0.3088 172/500 [=========>....................] - ETA: 1:51 - loss: 1.9889 - regression_loss: 1.6805 - classification_loss: 0.3084 173/500 [=========>....................] - ETA: 1:50 - loss: 1.9911 - regression_loss: 1.6827 - classification_loss: 0.3085 174/500 [=========>....................] - ETA: 1:50 - loss: 1.9897 - regression_loss: 1.6813 - classification_loss: 0.3084 175/500 [=========>....................] - ETA: 1:50 - loss: 1.9910 - regression_loss: 1.6828 - classification_loss: 0.3082 176/500 [=========>....................] - ETA: 1:49 - loss: 1.9913 - regression_loss: 1.6832 - classification_loss: 0.3081 177/500 [=========>....................] - ETA: 1:49 - loss: 1.9928 - regression_loss: 1.6849 - classification_loss: 0.3079 178/500 [=========>....................] - ETA: 1:49 - loss: 1.9947 - regression_loss: 1.6866 - classification_loss: 0.3082 179/500 [=========>....................] - ETA: 1:48 - loss: 1.9934 - regression_loss: 1.6856 - classification_loss: 0.3079 180/500 [=========>....................] - ETA: 1:48 - loss: 1.9888 - regression_loss: 1.6818 - classification_loss: 0.3070 181/500 [=========>....................] - ETA: 1:48 - loss: 1.9885 - regression_loss: 1.6818 - classification_loss: 0.3066 182/500 [=========>....................] - ETA: 1:47 - loss: 1.9924 - regression_loss: 1.6855 - classification_loss: 0.3069 183/500 [=========>....................] - ETA: 1:47 - loss: 1.9956 - regression_loss: 1.6886 - classification_loss: 0.3070 184/500 [==========>...................] - ETA: 1:47 - loss: 1.9966 - regression_loss: 1.6897 - classification_loss: 0.3069 185/500 [==========>...................] 
- ETA: 1:46 - loss: 2.0046 - regression_loss: 1.6958 - classification_loss: 0.3087 186/500 [==========>...................] - ETA: 1:46 - loss: 2.0060 - regression_loss: 1.6973 - classification_loss: 0.3087 187/500 [==========>...................] - ETA: 1:46 - loss: 2.0062 - regression_loss: 1.6977 - classification_loss: 0.3085 188/500 [==========>...................] - ETA: 1:45 - loss: 2.0062 - regression_loss: 1.6979 - classification_loss: 0.3084 189/500 [==========>...................] - ETA: 1:45 - loss: 2.0117 - regression_loss: 1.7016 - classification_loss: 0.3101 190/500 [==========>...................] - ETA: 1:45 - loss: 2.0118 - regression_loss: 1.7017 - classification_loss: 0.3101 191/500 [==========>...................] - ETA: 1:44 - loss: 2.0121 - regression_loss: 1.7020 - classification_loss: 0.3101 192/500 [==========>...................] - ETA: 1:44 - loss: 2.0147 - regression_loss: 1.7041 - classification_loss: 0.3106 193/500 [==========>...................] - ETA: 1:44 - loss: 2.0169 - regression_loss: 1.7058 - classification_loss: 0.3111 194/500 [==========>...................] - ETA: 1:43 - loss: 2.0162 - regression_loss: 1.7053 - classification_loss: 0.3110 195/500 [==========>...................] - ETA: 1:43 - loss: 2.0155 - regression_loss: 1.7042 - classification_loss: 0.3113 196/500 [==========>...................] - ETA: 1:43 - loss: 2.0136 - regression_loss: 1.7025 - classification_loss: 0.3111 197/500 [==========>...................] - ETA: 1:42 - loss: 2.0116 - regression_loss: 1.7010 - classification_loss: 0.3106 198/500 [==========>...................] - ETA: 1:42 - loss: 2.0112 - regression_loss: 1.7005 - classification_loss: 0.3107 199/500 [==========>...................] - ETA: 1:42 - loss: 2.0106 - regression_loss: 1.7002 - classification_loss: 0.3105 200/500 [===========>..................] - ETA: 1:41 - loss: 2.0119 - regression_loss: 1.7014 - classification_loss: 0.3105 201/500 [===========>..................] 
- ETA: 1:41 - loss: 2.0093 - regression_loss: 1.6986 - classification_loss: 0.3107 202/500 [===========>..................] - ETA: 1:41 - loss: 2.0151 - regression_loss: 1.7034 - classification_loss: 0.3116 203/500 [===========>..................] - ETA: 1:40 - loss: 2.0127 - regression_loss: 1.7017 - classification_loss: 0.3110 204/500 [===========>..................] - ETA: 1:40 - loss: 2.0112 - regression_loss: 1.7003 - classification_loss: 0.3109 205/500 [===========>..................] - ETA: 1:40 - loss: 2.0110 - regression_loss: 1.7001 - classification_loss: 0.3109 206/500 [===========>..................] - ETA: 1:39 - loss: 2.0107 - regression_loss: 1.7000 - classification_loss: 0.3107 207/500 [===========>..................] - ETA: 1:39 - loss: 2.0129 - regression_loss: 1.7016 - classification_loss: 0.3113 208/500 [===========>..................] - ETA: 1:38 - loss: 2.0132 - regression_loss: 1.7019 - classification_loss: 0.3114 209/500 [===========>..................] - ETA: 1:38 - loss: 2.0144 - regression_loss: 1.7025 - classification_loss: 0.3119 210/500 [===========>..................] - ETA: 1:38 - loss: 2.0135 - regression_loss: 1.7014 - classification_loss: 0.3121 211/500 [===========>..................] - ETA: 1:37 - loss: 2.0132 - regression_loss: 1.7012 - classification_loss: 0.3120 212/500 [===========>..................] - ETA: 1:37 - loss: 2.0131 - regression_loss: 1.7016 - classification_loss: 0.3115 213/500 [===========>..................] - ETA: 1:37 - loss: 2.0127 - regression_loss: 1.7014 - classification_loss: 0.3113 214/500 [===========>..................] - ETA: 1:36 - loss: 2.0132 - regression_loss: 1.7015 - classification_loss: 0.3118 215/500 [===========>..................] - ETA: 1:36 - loss: 2.0093 - regression_loss: 1.6981 - classification_loss: 0.3112 216/500 [===========>..................] - ETA: 1:36 - loss: 2.0099 - regression_loss: 1.6987 - classification_loss: 0.3111 217/500 [============>.................] 
- ETA: 1:35 - loss: 2.0115 - regression_loss: 1.7005 - classification_loss: 0.3110 218/500 [============>.................] - ETA: 1:35 - loss: 2.0104 - regression_loss: 1.6988 - classification_loss: 0.3116 219/500 [============>.................] - ETA: 1:35 - loss: 2.0102 - regression_loss: 1.6984 - classification_loss: 0.3118 220/500 [============>.................] - ETA: 1:34 - loss: 2.0113 - regression_loss: 1.6992 - classification_loss: 0.3121 221/500 [============>.................] - ETA: 1:34 - loss: 2.0100 - regression_loss: 1.6981 - classification_loss: 0.3119 222/500 [============>.................] - ETA: 1:34 - loss: 2.0116 - regression_loss: 1.6994 - classification_loss: 0.3121 223/500 [============>.................] - ETA: 1:33 - loss: 2.0110 - regression_loss: 1.6992 - classification_loss: 0.3118 224/500 [============>.................] - ETA: 1:33 - loss: 2.0104 - regression_loss: 1.6985 - classification_loss: 0.3118 225/500 [============>.................] - ETA: 1:33 - loss: 2.0101 - regression_loss: 1.6985 - classification_loss: 0.3116 226/500 [============>.................] - ETA: 1:32 - loss: 2.0101 - regression_loss: 1.6983 - classification_loss: 0.3119 227/500 [============>.................] - ETA: 1:32 - loss: 2.0060 - regression_loss: 1.6951 - classification_loss: 0.3110 228/500 [============>.................] - ETA: 1:32 - loss: 2.0073 - regression_loss: 1.6962 - classification_loss: 0.3112 229/500 [============>.................] - ETA: 1:31 - loss: 2.0071 - regression_loss: 1.6961 - classification_loss: 0.3111 230/500 [============>.................] - ETA: 1:31 - loss: 2.0040 - regression_loss: 1.6934 - classification_loss: 0.3106 231/500 [============>.................] - ETA: 1:31 - loss: 2.0016 - regression_loss: 1.6915 - classification_loss: 0.3101 232/500 [============>.................] - ETA: 1:30 - loss: 2.0024 - regression_loss: 1.6920 - classification_loss: 0.3104 233/500 [============>.................] 
- ETA: 1:30 - loss: 2.0015 - regression_loss: 1.6912 - classification_loss: 0.3103 234/500 [=============>................] - ETA: 1:30 - loss: 2.0013 - regression_loss: 1.6913 - classification_loss: 0.3100 235/500 [=============>................] - ETA: 1:29 - loss: 2.0023 - regression_loss: 1.6922 - classification_loss: 0.3101 236/500 [=============>................] - ETA: 1:29 - loss: 2.0030 - regression_loss: 1.6934 - classification_loss: 0.3096 237/500 [=============>................] - ETA: 1:29 - loss: 2.0032 - regression_loss: 1.6938 - classification_loss: 0.3094 238/500 [=============>................] - ETA: 1:28 - loss: 2.0033 - regression_loss: 1.6940 - classification_loss: 0.3092 239/500 [=============>................] - ETA: 1:28 - loss: 2.0014 - regression_loss: 1.6926 - classification_loss: 0.3087 240/500 [=============>................] - ETA: 1:28 - loss: 2.0026 - regression_loss: 1.6940 - classification_loss: 0.3086 241/500 [=============>................] - ETA: 1:27 - loss: 2.0025 - regression_loss: 1.6941 - classification_loss: 0.3084 242/500 [=============>................] - ETA: 1:27 - loss: 2.0027 - regression_loss: 1.6943 - classification_loss: 0.3083 243/500 [=============>................] - ETA: 1:27 - loss: 2.0011 - regression_loss: 1.6930 - classification_loss: 0.3080 244/500 [=============>................] - ETA: 1:26 - loss: 2.0007 - regression_loss: 1.6931 - classification_loss: 0.3076 245/500 [=============>................] - ETA: 1:26 - loss: 2.0003 - regression_loss: 1.6929 - classification_loss: 0.3074 246/500 [=============>................] - ETA: 1:26 - loss: 1.9971 - regression_loss: 1.6904 - classification_loss: 0.3067 247/500 [=============>................] - ETA: 1:25 - loss: 1.9974 - regression_loss: 1.6907 - classification_loss: 0.3067 248/500 [=============>................] - ETA: 1:25 - loss: 1.9978 - regression_loss: 1.6910 - classification_loss: 0.3068 249/500 [=============>................] 
- ETA: 1:25 - loss: 2.0005 - regression_loss: 1.6931 - classification_loss: 0.3074 250/500 [==============>...............] - ETA: 1:24 - loss: 2.0013 - regression_loss: 1.6937 - classification_loss: 0.3076 251/500 [==============>...............] - ETA: 1:24 - loss: 2.0027 - regression_loss: 1.6949 - classification_loss: 0.3079 252/500 [==============>...............] - ETA: 1:24 - loss: 2.0014 - regression_loss: 1.6938 - classification_loss: 0.3076 253/500 [==============>...............] - ETA: 1:23 - loss: 1.9999 - regression_loss: 1.6928 - classification_loss: 0.3071 254/500 [==============>...............] - ETA: 1:23 - loss: 1.9955 - regression_loss: 1.6891 - classification_loss: 0.3065 255/500 [==============>...............] - ETA: 1:23 - loss: 1.9937 - regression_loss: 1.6873 - classification_loss: 0.3063 256/500 [==============>...............] - ETA: 1:22 - loss: 1.9940 - regression_loss: 1.6880 - classification_loss: 0.3060 257/500 [==============>...............] - ETA: 1:22 - loss: 1.9936 - regression_loss: 1.6876 - classification_loss: 0.3060 258/500 [==============>...............] - ETA: 1:22 - loss: 1.9930 - regression_loss: 1.6869 - classification_loss: 0.3060 259/500 [==============>...............] - ETA: 1:21 - loss: 1.9928 - regression_loss: 1.6869 - classification_loss: 0.3060 260/500 [==============>...............] - ETA: 1:21 - loss: 1.9929 - regression_loss: 1.6870 - classification_loss: 0.3059 261/500 [==============>...............] - ETA: 1:21 - loss: 1.9960 - regression_loss: 1.6898 - classification_loss: 0.3062 262/500 [==============>...............] - ETA: 1:20 - loss: 1.9959 - regression_loss: 1.6899 - classification_loss: 0.3060 263/500 [==============>...............] - ETA: 1:20 - loss: 1.9958 - regression_loss: 1.6899 - classification_loss: 0.3059 264/500 [==============>...............] - ETA: 1:19 - loss: 1.9948 - regression_loss: 1.6891 - classification_loss: 0.3057 265/500 [==============>...............] 
- ETA: 1:19 - loss: 1.9963 - regression_loss: 1.6900 - classification_loss: 0.3063 266/500 [==============>...............] - ETA: 1:19 - loss: 1.9963 - regression_loss: 1.6901 - classification_loss: 0.3062 267/500 [===============>..............] - ETA: 1:18 - loss: 1.9963 - regression_loss: 1.6898 - classification_loss: 0.3065 268/500 [===============>..............] - ETA: 1:18 - loss: 1.9970 - regression_loss: 1.6905 - classification_loss: 0.3066 269/500 [===============>..............] - ETA: 1:18 - loss: 1.9945 - regression_loss: 1.6885 - classification_loss: 0.3060 270/500 [===============>..............] - ETA: 1:17 - loss: 1.9964 - regression_loss: 1.6902 - classification_loss: 0.3062 271/500 [===============>..............] - ETA: 1:17 - loss: 1.9947 - regression_loss: 1.6890 - classification_loss: 0.3057 272/500 [===============>..............] - ETA: 1:17 - loss: 1.9921 - regression_loss: 1.6870 - classification_loss: 0.3051 273/500 [===============>..............] - ETA: 1:16 - loss: 1.9928 - regression_loss: 1.6877 - classification_loss: 0.3051 274/500 [===============>..............] - ETA: 1:16 - loss: 1.9929 - regression_loss: 1.6878 - classification_loss: 0.3051 275/500 [===============>..............] - ETA: 1:16 - loss: 1.9909 - regression_loss: 1.6861 - classification_loss: 0.3048 276/500 [===============>..............] - ETA: 1:15 - loss: 1.9914 - regression_loss: 1.6867 - classification_loss: 0.3047 277/500 [===============>..............] - ETA: 1:15 - loss: 1.9939 - regression_loss: 1.6891 - classification_loss: 0.3049 278/500 [===============>..............] - ETA: 1:15 - loss: 1.9896 - regression_loss: 1.6854 - classification_loss: 0.3041 279/500 [===============>..............] - ETA: 1:14 - loss: 1.9913 - regression_loss: 1.6871 - classification_loss: 0.3042 280/500 [===============>..............] - ETA: 1:14 - loss: 1.9907 - regression_loss: 1.6865 - classification_loss: 0.3042 281/500 [===============>..............] 
- ETA: 1:14 - loss: 1.9913 - regression_loss: 1.6874 - classification_loss: 0.3039 282/500 [===============>..............] - ETA: 1:13 - loss: 1.9908 - regression_loss: 1.6871 - classification_loss: 0.3037 283/500 [===============>..............] - ETA: 1:13 - loss: 1.9893 - regression_loss: 1.6859 - classification_loss: 0.3035 284/500 [================>.............] - ETA: 1:13 - loss: 1.9908 - regression_loss: 1.6871 - classification_loss: 0.3037 285/500 [================>.............] - ETA: 1:12 - loss: 1.9898 - regression_loss: 1.6865 - classification_loss: 0.3032 286/500 [================>.............] - ETA: 1:12 - loss: 1.9870 - regression_loss: 1.6840 - classification_loss: 0.3031 287/500 [================>.............] - ETA: 1:12 - loss: 1.9863 - regression_loss: 1.6834 - classification_loss: 0.3028 288/500 [================>.............] - ETA: 1:11 - loss: 1.9886 - regression_loss: 1.6855 - classification_loss: 0.3031 289/500 [================>.............] - ETA: 1:11 - loss: 1.9889 - regression_loss: 1.6859 - classification_loss: 0.3030 290/500 [================>.............] - ETA: 1:11 - loss: 1.9896 - regression_loss: 1.6865 - classification_loss: 0.3030 291/500 [================>.............] - ETA: 1:10 - loss: 1.9878 - regression_loss: 1.6852 - classification_loss: 0.3027 292/500 [================>.............] - ETA: 1:10 - loss: 1.9871 - regression_loss: 1.6845 - classification_loss: 0.3026 293/500 [================>.............] - ETA: 1:10 - loss: 1.9871 - regression_loss: 1.6845 - classification_loss: 0.3026 294/500 [================>.............] - ETA: 1:09 - loss: 1.9853 - regression_loss: 1.6829 - classification_loss: 0.3023 295/500 [================>.............] - ETA: 1:09 - loss: 1.9858 - regression_loss: 1.6835 - classification_loss: 0.3023 296/500 [================>.............] - ETA: 1:09 - loss: 1.9861 - regression_loss: 1.6838 - classification_loss: 0.3023 297/500 [================>.............] 
- ETA: 1:08 - loss: 1.9854 - regression_loss: 1.6832 - classification_loss: 0.3022 298/500 [================>.............] - ETA: 1:08 - loss: 1.9869 - regression_loss: 1.6847 - classification_loss: 0.3022 299/500 [================>.............] - ETA: 1:08 - loss: 1.9899 - regression_loss: 1.6874 - classification_loss: 0.3025 300/500 [=================>............] - ETA: 1:07 - loss: 1.9918 - regression_loss: 1.6890 - classification_loss: 0.3028 301/500 [=================>............] - ETA: 1:07 - loss: 1.9916 - regression_loss: 1.6889 - classification_loss: 0.3027 302/500 [=================>............] - ETA: 1:07 - loss: 1.9925 - regression_loss: 1.6898 - classification_loss: 0.3027 303/500 [=================>............] - ETA: 1:06 - loss: 1.9902 - regression_loss: 1.6878 - classification_loss: 0.3024 304/500 [=================>............] - ETA: 1:06 - loss: 1.9979 - regression_loss: 1.6906 - classification_loss: 0.3073 305/500 [=================>............] - ETA: 1:06 - loss: 1.9976 - regression_loss: 1.6905 - classification_loss: 0.3072 306/500 [=================>............] - ETA: 1:05 - loss: 1.9962 - regression_loss: 1.6891 - classification_loss: 0.3070 307/500 [=================>............] - ETA: 1:05 - loss: 1.9957 - regression_loss: 1.6888 - classification_loss: 0.3070 308/500 [=================>............] - ETA: 1:05 - loss: 1.9958 - regression_loss: 1.6888 - classification_loss: 0.3069 309/500 [=================>............] - ETA: 1:04 - loss: 1.9970 - regression_loss: 1.6899 - classification_loss: 0.3071 310/500 [=================>............] - ETA: 1:04 - loss: 1.9953 - regression_loss: 1.6886 - classification_loss: 0.3067 311/500 [=================>............] - ETA: 1:04 - loss: 1.9954 - regression_loss: 1.6890 - classification_loss: 0.3063 312/500 [=================>............] - ETA: 1:03 - loss: 1.9922 - regression_loss: 1.6863 - classification_loss: 0.3059 313/500 [=================>............] 
- ETA: 1:03 - loss: 1.9931 - regression_loss: 1.6875 - classification_loss: 0.3056 314/500 [=================>............] - ETA: 1:03 - loss: 1.9910 - regression_loss: 1.6858 - classification_loss: 0.3052 315/500 [=================>............] - ETA: 1:02 - loss: 1.9908 - regression_loss: 1.6854 - classification_loss: 0.3053 316/500 [=================>............] - ETA: 1:02 - loss: 1.9897 - regression_loss: 1.6845 - classification_loss: 0.3052 317/500 [==================>...........] - ETA: 1:02 - loss: 1.9908 - regression_loss: 1.6857 - classification_loss: 0.3051 318/500 [==================>...........] - ETA: 1:01 - loss: 1.9874 - regression_loss: 1.6829 - classification_loss: 0.3045 319/500 [==================>...........] - ETA: 1:01 - loss: 1.9872 - regression_loss: 1.6829 - classification_loss: 0.3043 320/500 [==================>...........] - ETA: 1:01 - loss: 1.9859 - regression_loss: 1.6818 - classification_loss: 0.3040 321/500 [==================>...........] - ETA: 1:00 - loss: 1.9865 - regression_loss: 1.6823 - classification_loss: 0.3042 322/500 [==================>...........] - ETA: 1:00 - loss: 1.9868 - regression_loss: 1.6825 - classification_loss: 0.3042 323/500 [==================>...........] - ETA: 1:00 - loss: 1.9869 - regression_loss: 1.6828 - classification_loss: 0.3042 324/500 [==================>...........] - ETA: 59s - loss: 1.9861 - regression_loss: 1.6821 - classification_loss: 0.3040  325/500 [==================>...........] - ETA: 59s - loss: 1.9857 - regression_loss: 1.6819 - classification_loss: 0.3038 326/500 [==================>...........] - ETA: 59s - loss: 1.9864 - regression_loss: 1.6828 - classification_loss: 0.3036 327/500 [==================>...........] - ETA: 58s - loss: 1.9873 - regression_loss: 1.6838 - classification_loss: 0.3035 328/500 [==================>...........] - ETA: 58s - loss: 1.9840 - regression_loss: 1.6809 - classification_loss: 0.3031 329/500 [==================>...........] 
- ETA: 58s - loss: 1.9839 - regression_loss: 1.6809 - classification_loss: 0.3030 330/500 [==================>...........] - ETA: 57s - loss: 1.9846 - regression_loss: 1.6815 - classification_loss: 0.3031 331/500 [==================>...........] - ETA: 57s - loss: 1.9837 - regression_loss: 1.6810 - classification_loss: 0.3027 332/500 [==================>...........] - ETA: 57s - loss: 1.9831 - regression_loss: 1.6806 - classification_loss: 0.3024 333/500 [==================>...........] - ETA: 56s - loss: 1.9820 - regression_loss: 1.6797 - classification_loss: 0.3023 334/500 [===================>..........] - ETA: 56s - loss: 1.9822 - regression_loss: 1.6799 - classification_loss: 0.3023 335/500 [===================>..........] - ETA: 56s - loss: 1.9826 - regression_loss: 1.6804 - classification_loss: 0.3022 336/500 [===================>..........] - ETA: 55s - loss: 1.9819 - regression_loss: 1.6799 - classification_loss: 0.3020 337/500 [===================>..........] - ETA: 55s - loss: 1.9827 - regression_loss: 1.6806 - classification_loss: 0.3021 338/500 [===================>..........] - ETA: 54s - loss: 1.9810 - regression_loss: 1.6792 - classification_loss: 0.3018 339/500 [===================>..........] - ETA: 54s - loss: 1.9808 - regression_loss: 1.6791 - classification_loss: 0.3017 340/500 [===================>..........] - ETA: 54s - loss: 1.9806 - regression_loss: 1.6790 - classification_loss: 0.3016 341/500 [===================>..........] - ETA: 53s - loss: 1.9806 - regression_loss: 1.6790 - classification_loss: 0.3016 342/500 [===================>..........] - ETA: 53s - loss: 1.9785 - regression_loss: 1.6774 - classification_loss: 0.3010 343/500 [===================>..........] - ETA: 53s - loss: 1.9775 - regression_loss: 1.6765 - classification_loss: 0.3010 344/500 [===================>..........] - ETA: 52s - loss: 1.9770 - regression_loss: 1.6764 - classification_loss: 0.3006 345/500 [===================>..........] 
- ETA: 52s - loss: 1.9783 - regression_loss: 1.6775 - classification_loss: 0.3008 346/500 [===================>..........] - ETA: 52s - loss: 1.9778 - regression_loss: 1.6771 - classification_loss: 0.3007 347/500 [===================>..........] - ETA: 51s - loss: 1.9783 - regression_loss: 1.6777 - classification_loss: 0.3006 348/500 [===================>..........] - ETA: 51s - loss: 1.9774 - regression_loss: 1.6769 - classification_loss: 0.3005 349/500 [===================>..........] - ETA: 51s - loss: 1.9762 - regression_loss: 1.6760 - classification_loss: 0.3002 350/500 [====================>.........] - ETA: 50s - loss: 1.9758 - regression_loss: 1.6757 - classification_loss: 0.3001 351/500 [====================>.........] - ETA: 50s - loss: 1.9747 - regression_loss: 1.6747 - classification_loss: 0.3000 352/500 [====================>.........] - ETA: 50s - loss: 1.9759 - regression_loss: 1.6755 - classification_loss: 0.3004 353/500 [====================>.........] - ETA: 49s - loss: 1.9762 - regression_loss: 1.6756 - classification_loss: 0.3006 354/500 [====================>.........] - ETA: 49s - loss: 1.9761 - regression_loss: 1.6757 - classification_loss: 0.3004 355/500 [====================>.........] - ETA: 49s - loss: 1.9739 - regression_loss: 1.6737 - classification_loss: 0.3002 356/500 [====================>.........] - ETA: 48s - loss: 1.9732 - regression_loss: 1.6732 - classification_loss: 0.3000 357/500 [====================>.........] - ETA: 48s - loss: 1.9725 - regression_loss: 1.6722 - classification_loss: 0.3003 358/500 [====================>.........] - ETA: 48s - loss: 1.9714 - regression_loss: 1.6713 - classification_loss: 0.3001 359/500 [====================>.........] - ETA: 47s - loss: 1.9728 - regression_loss: 1.6726 - classification_loss: 0.3002 360/500 [====================>.........] - ETA: 47s - loss: 1.9728 - regression_loss: 1.6727 - classification_loss: 0.3001 361/500 [====================>.........] 
- ETA: 47s - loss: 1.9722 - regression_loss: 1.6723 - classification_loss: 0.2999 362/500 [====================>.........] - ETA: 46s - loss: 1.9718 - regression_loss: 1.6722 - classification_loss: 0.2996 363/500 [====================>.........] - ETA: 46s - loss: 1.9726 - regression_loss: 1.6727 - classification_loss: 0.2999 364/500 [====================>.........] - ETA: 46s - loss: 1.9718 - regression_loss: 1.6721 - classification_loss: 0.2997 365/500 [====================>.........] - ETA: 45s - loss: 1.9721 - regression_loss: 1.6723 - classification_loss: 0.2998 366/500 [====================>.........] - ETA: 45s - loss: 1.9715 - regression_loss: 1.6718 - classification_loss: 0.2997 367/500 [=====================>........] - ETA: 45s - loss: 1.9717 - regression_loss: 1.6718 - classification_loss: 0.2999 368/500 [=====================>........] - ETA: 44s - loss: 1.9718 - regression_loss: 1.6720 - classification_loss: 0.2998 369/500 [=====================>........] - ETA: 44s - loss: 1.9724 - regression_loss: 1.6727 - classification_loss: 0.2998 370/500 [=====================>........] - ETA: 44s - loss: 1.9727 - regression_loss: 1.6732 - classification_loss: 0.2995 371/500 [=====================>........] - ETA: 43s - loss: 1.9722 - regression_loss: 1.6728 - classification_loss: 0.2994 372/500 [=====================>........] - ETA: 43s - loss: 1.9689 - regression_loss: 1.6699 - classification_loss: 0.2990 373/500 [=====================>........] - ETA: 43s - loss: 1.9702 - regression_loss: 1.6710 - classification_loss: 0.2992 374/500 [=====================>........] - ETA: 42s - loss: 1.9712 - regression_loss: 1.6719 - classification_loss: 0.2993 375/500 [=====================>........] - ETA: 42s - loss: 1.9732 - regression_loss: 1.6732 - classification_loss: 0.3001 376/500 [=====================>........] - ETA: 42s - loss: 1.9729 - regression_loss: 1.6729 - classification_loss: 0.3000 377/500 [=====================>........] 
- ETA: 41s - loss: 1.9721 - regression_loss: 1.6723 - classification_loss: 0.2998 378/500 [=====================>........]
... [per-step progress updates for steps 378-499 of epoch 5 trimmed; loss held steady near 1.96-1.97] ...
500/500 [==============================] - 170s 341ms/step - loss: 1.9583 - regression_loss: 1.6645 - classification_loss: 0.2938
1172 instances of class plum with average precision: 0.6080
mAP: 0.6080
Epoch 00005: saving model to ./training/snapshots/resnet101_pascal_05.h5
Epoch 6/150
1/500 [..............................] - ETA: 2:38 - loss: 2.3190 - regression_loss: 1.9731 - classification_loss: 0.3459
... [per-step progress updates for steps 2-211 of epoch 6 trimmed; loss settled near 1.86-1.93] ...
212/500 [===========>..................]
- ETA: 1:37 - loss: 1.8647 - regression_loss: 1.5897 - classification_loss: 0.2750 213/500 [===========>..................] - ETA: 1:37 - loss: 1.8662 - regression_loss: 1.5913 - classification_loss: 0.2750 214/500 [===========>..................] - ETA: 1:37 - loss: 1.8681 - regression_loss: 1.5921 - classification_loss: 0.2760 215/500 [===========>..................] - ETA: 1:36 - loss: 1.8706 - regression_loss: 1.5943 - classification_loss: 0.2763 216/500 [===========>..................] - ETA: 1:36 - loss: 1.8654 - regression_loss: 1.5899 - classification_loss: 0.2755 217/500 [============>.................] - ETA: 1:36 - loss: 1.8658 - regression_loss: 1.5901 - classification_loss: 0.2757 218/500 [============>.................] - ETA: 1:35 - loss: 1.8678 - regression_loss: 1.5916 - classification_loss: 0.2761 219/500 [============>.................] - ETA: 1:35 - loss: 1.8661 - regression_loss: 1.5906 - classification_loss: 0.2755 220/500 [============>.................] - ETA: 1:35 - loss: 1.8669 - regression_loss: 1.5915 - classification_loss: 0.2755 221/500 [============>.................] - ETA: 1:34 - loss: 1.8680 - regression_loss: 1.5923 - classification_loss: 0.2757 222/500 [============>.................] - ETA: 1:34 - loss: 1.8720 - regression_loss: 1.5956 - classification_loss: 0.2763 223/500 [============>.................] - ETA: 1:34 - loss: 1.8714 - regression_loss: 1.5953 - classification_loss: 0.2761 224/500 [============>.................] - ETA: 1:33 - loss: 1.8688 - regression_loss: 1.5933 - classification_loss: 0.2755 225/500 [============>.................] - ETA: 1:33 - loss: 1.8673 - regression_loss: 1.5923 - classification_loss: 0.2750 226/500 [============>.................] - ETA: 1:33 - loss: 1.8664 - regression_loss: 1.5914 - classification_loss: 0.2749 227/500 [============>.................] - ETA: 1:32 - loss: 1.8663 - regression_loss: 1.5914 - classification_loss: 0.2749 228/500 [============>.................] 
- ETA: 1:32 - loss: 1.8629 - regression_loss: 1.5886 - classification_loss: 0.2744 229/500 [============>.................] - ETA: 1:32 - loss: 1.8595 - regression_loss: 1.5858 - classification_loss: 0.2738 230/500 [============>.................] - ETA: 1:31 - loss: 1.8618 - regression_loss: 1.5875 - classification_loss: 0.2743 231/500 [============>.................] - ETA: 1:31 - loss: 1.8631 - regression_loss: 1.5879 - classification_loss: 0.2753 232/500 [============>.................] - ETA: 1:31 - loss: 1.8633 - regression_loss: 1.5881 - classification_loss: 0.2752 233/500 [============>.................] - ETA: 1:30 - loss: 1.8626 - regression_loss: 1.5872 - classification_loss: 0.2754 234/500 [=============>................] - ETA: 1:30 - loss: 1.8591 - regression_loss: 1.5842 - classification_loss: 0.2748 235/500 [=============>................] - ETA: 1:30 - loss: 1.8592 - regression_loss: 1.5842 - classification_loss: 0.2750 236/500 [=============>................] - ETA: 1:29 - loss: 1.8566 - regression_loss: 1.5820 - classification_loss: 0.2746 237/500 [=============>................] - ETA: 1:29 - loss: 1.8551 - regression_loss: 1.5809 - classification_loss: 0.2743 238/500 [=============>................] - ETA: 1:29 - loss: 1.8548 - regression_loss: 1.5801 - classification_loss: 0.2747 239/500 [=============>................] - ETA: 1:28 - loss: 1.8538 - regression_loss: 1.5793 - classification_loss: 0.2744 240/500 [=============>................] - ETA: 1:28 - loss: 1.8511 - regression_loss: 1.5771 - classification_loss: 0.2741 241/500 [=============>................] - ETA: 1:27 - loss: 1.8499 - regression_loss: 1.5760 - classification_loss: 0.2738 242/500 [=============>................] - ETA: 1:27 - loss: 1.8500 - regression_loss: 1.5762 - classification_loss: 0.2738 243/500 [=============>................] - ETA: 1:27 - loss: 1.8505 - regression_loss: 1.5743 - classification_loss: 0.2762 244/500 [=============>................] 
- ETA: 1:26 - loss: 1.8591 - regression_loss: 1.5790 - classification_loss: 0.2801 245/500 [=============>................] - ETA: 1:26 - loss: 1.8581 - regression_loss: 1.5779 - classification_loss: 0.2802 246/500 [=============>................] - ETA: 1:26 - loss: 1.8582 - regression_loss: 1.5781 - classification_loss: 0.2801 247/500 [=============>................] - ETA: 1:25 - loss: 1.8597 - regression_loss: 1.5793 - classification_loss: 0.2804 248/500 [=============>................] - ETA: 1:25 - loss: 1.8604 - regression_loss: 1.5797 - classification_loss: 0.2808 249/500 [=============>................] - ETA: 1:25 - loss: 1.8571 - regression_loss: 1.5770 - classification_loss: 0.2800 250/500 [==============>...............] - ETA: 1:24 - loss: 1.8581 - regression_loss: 1.5778 - classification_loss: 0.2803 251/500 [==============>...............] - ETA: 1:24 - loss: 1.8601 - regression_loss: 1.5790 - classification_loss: 0.2811 252/500 [==============>...............] - ETA: 1:24 - loss: 1.8594 - regression_loss: 1.5786 - classification_loss: 0.2808 253/500 [==============>...............] - ETA: 1:23 - loss: 1.8582 - regression_loss: 1.5776 - classification_loss: 0.2806 254/500 [==============>...............] - ETA: 1:23 - loss: 1.8585 - regression_loss: 1.5780 - classification_loss: 0.2805 255/500 [==============>...............] - ETA: 1:23 - loss: 1.8572 - regression_loss: 1.5770 - classification_loss: 0.2802 256/500 [==============>...............] - ETA: 1:22 - loss: 1.8565 - regression_loss: 1.5765 - classification_loss: 0.2800 257/500 [==============>...............] - ETA: 1:22 - loss: 1.8575 - regression_loss: 1.5772 - classification_loss: 0.2802 258/500 [==============>...............] - ETA: 1:22 - loss: 1.8578 - regression_loss: 1.5776 - classification_loss: 0.2802 259/500 [==============>...............] - ETA: 1:21 - loss: 1.8576 - regression_loss: 1.5776 - classification_loss: 0.2800 260/500 [==============>...............] 
- ETA: 1:21 - loss: 1.8575 - regression_loss: 1.5776 - classification_loss: 0.2799 261/500 [==============>...............] - ETA: 1:21 - loss: 1.8557 - regression_loss: 1.5762 - classification_loss: 0.2794 262/500 [==============>...............] - ETA: 1:20 - loss: 1.8542 - regression_loss: 1.5749 - classification_loss: 0.2792 263/500 [==============>...............] - ETA: 1:20 - loss: 1.8550 - regression_loss: 1.5757 - classification_loss: 0.2793 264/500 [==============>...............] - ETA: 1:20 - loss: 1.8526 - regression_loss: 1.5738 - classification_loss: 0.2788 265/500 [==============>...............] - ETA: 1:19 - loss: 1.8567 - regression_loss: 1.5771 - classification_loss: 0.2796 266/500 [==============>...............] - ETA: 1:19 - loss: 1.8559 - regression_loss: 1.5767 - classification_loss: 0.2792 267/500 [===============>..............] - ETA: 1:19 - loss: 1.8570 - regression_loss: 1.5777 - classification_loss: 0.2793 268/500 [===============>..............] - ETA: 1:18 - loss: 1.8604 - regression_loss: 1.5799 - classification_loss: 0.2805 269/500 [===============>..............] - ETA: 1:18 - loss: 1.8637 - regression_loss: 1.5829 - classification_loss: 0.2808 270/500 [===============>..............] - ETA: 1:18 - loss: 1.8648 - regression_loss: 1.5841 - classification_loss: 0.2806 271/500 [===============>..............] - ETA: 1:17 - loss: 1.8617 - regression_loss: 1.5816 - classification_loss: 0.2801 272/500 [===============>..............] - ETA: 1:17 - loss: 1.8627 - regression_loss: 1.5825 - classification_loss: 0.2802 273/500 [===============>..............] - ETA: 1:17 - loss: 1.8630 - regression_loss: 1.5828 - classification_loss: 0.2802 274/500 [===============>..............] - ETA: 1:16 - loss: 1.8598 - regression_loss: 1.5801 - classification_loss: 0.2796 275/500 [===============>..............] - ETA: 1:16 - loss: 1.8607 - regression_loss: 1.5812 - classification_loss: 0.2795 276/500 [===============>..............] 
- ETA: 1:16 - loss: 1.8621 - regression_loss: 1.5821 - classification_loss: 0.2799 277/500 [===============>..............] - ETA: 1:15 - loss: 1.8603 - regression_loss: 1.5810 - classification_loss: 0.2794 278/500 [===============>..............] - ETA: 1:15 - loss: 1.8583 - regression_loss: 1.5792 - classification_loss: 0.2791 279/500 [===============>..............] - ETA: 1:15 - loss: 1.8579 - regression_loss: 1.5789 - classification_loss: 0.2790 280/500 [===============>..............] - ETA: 1:14 - loss: 1.8578 - regression_loss: 1.5791 - classification_loss: 0.2787 281/500 [===============>..............] - ETA: 1:14 - loss: 1.8582 - regression_loss: 1.5792 - classification_loss: 0.2789 282/500 [===============>..............] - ETA: 1:14 - loss: 1.8580 - regression_loss: 1.5792 - classification_loss: 0.2788 283/500 [===============>..............] - ETA: 1:13 - loss: 1.8549 - regression_loss: 1.5766 - classification_loss: 0.2782 284/500 [================>.............] - ETA: 1:13 - loss: 1.8530 - regression_loss: 1.5749 - classification_loss: 0.2781 285/500 [================>.............] - ETA: 1:13 - loss: 1.8539 - regression_loss: 1.5755 - classification_loss: 0.2784 286/500 [================>.............] - ETA: 1:12 - loss: 1.8559 - regression_loss: 1.5772 - classification_loss: 0.2787 287/500 [================>.............] - ETA: 1:12 - loss: 1.8517 - regression_loss: 1.5717 - classification_loss: 0.2800 288/500 [================>.............] - ETA: 1:12 - loss: 1.8525 - regression_loss: 1.5725 - classification_loss: 0.2800 289/500 [================>.............] - ETA: 1:11 - loss: 1.8543 - regression_loss: 1.5742 - classification_loss: 0.2802 290/500 [================>.............] - ETA: 1:11 - loss: 1.8536 - regression_loss: 1.5738 - classification_loss: 0.2799 291/500 [================>.............] - ETA: 1:10 - loss: 1.8543 - regression_loss: 1.5744 - classification_loss: 0.2798 292/500 [================>.............] 
- ETA: 1:10 - loss: 1.8539 - regression_loss: 1.5744 - classification_loss: 0.2795 293/500 [================>.............] - ETA: 1:10 - loss: 1.8543 - regression_loss: 1.5748 - classification_loss: 0.2795 294/500 [================>.............] - ETA: 1:09 - loss: 1.8531 - regression_loss: 1.5736 - classification_loss: 0.2795 295/500 [================>.............] - ETA: 1:09 - loss: 1.8510 - regression_loss: 1.5719 - classification_loss: 0.2791 296/500 [================>.............] - ETA: 1:09 - loss: 1.8517 - regression_loss: 1.5724 - classification_loss: 0.2793 297/500 [================>.............] - ETA: 1:08 - loss: 1.8530 - regression_loss: 1.5715 - classification_loss: 0.2815 298/500 [================>.............] - ETA: 1:08 - loss: 1.8517 - regression_loss: 1.5704 - classification_loss: 0.2813 299/500 [================>.............] - ETA: 1:08 - loss: 1.8517 - regression_loss: 1.5704 - classification_loss: 0.2813 300/500 [=================>............] - ETA: 1:07 - loss: 1.8521 - regression_loss: 1.5708 - classification_loss: 0.2813 301/500 [=================>............] - ETA: 1:07 - loss: 1.8518 - regression_loss: 1.5706 - classification_loss: 0.2812 302/500 [=================>............] - ETA: 1:07 - loss: 1.8529 - regression_loss: 1.5714 - classification_loss: 0.2815 303/500 [=================>............] - ETA: 1:06 - loss: 1.8539 - regression_loss: 1.5726 - classification_loss: 0.2813 304/500 [=================>............] - ETA: 1:06 - loss: 1.8543 - regression_loss: 1.5731 - classification_loss: 0.2812 305/500 [=================>............] - ETA: 1:06 - loss: 1.8544 - regression_loss: 1.5731 - classification_loss: 0.2813 306/500 [=================>............] - ETA: 1:05 - loss: 1.8525 - regression_loss: 1.5715 - classification_loss: 0.2810 307/500 [=================>............] - ETA: 1:05 - loss: 1.8526 - regression_loss: 1.5717 - classification_loss: 0.2809 308/500 [=================>............] 
- ETA: 1:05 - loss: 1.8525 - regression_loss: 1.5716 - classification_loss: 0.2809 309/500 [=================>............] - ETA: 1:04 - loss: 1.8524 - regression_loss: 1.5717 - classification_loss: 0.2807 310/500 [=================>............] - ETA: 1:04 - loss: 1.8525 - regression_loss: 1.5719 - classification_loss: 0.2806 311/500 [=================>............] - ETA: 1:04 - loss: 1.8512 - regression_loss: 1.5710 - classification_loss: 0.2803 312/500 [=================>............] - ETA: 1:03 - loss: 1.8520 - regression_loss: 1.5718 - classification_loss: 0.2802 313/500 [=================>............] - ETA: 1:03 - loss: 1.8512 - regression_loss: 1.5712 - classification_loss: 0.2800 314/500 [=================>............] - ETA: 1:03 - loss: 1.8510 - regression_loss: 1.5711 - classification_loss: 0.2799 315/500 [=================>............] - ETA: 1:02 - loss: 1.8516 - regression_loss: 1.5717 - classification_loss: 0.2799 316/500 [=================>............] - ETA: 1:02 - loss: 1.8534 - regression_loss: 1.5734 - classification_loss: 0.2801 317/500 [==================>...........] - ETA: 1:02 - loss: 1.8548 - regression_loss: 1.5746 - classification_loss: 0.2802 318/500 [==================>...........] - ETA: 1:01 - loss: 1.8503 - regression_loss: 1.5707 - classification_loss: 0.2796 319/500 [==================>...........] - ETA: 1:01 - loss: 1.8504 - regression_loss: 1.5708 - classification_loss: 0.2796 320/500 [==================>...........] - ETA: 1:01 - loss: 1.8524 - regression_loss: 1.5726 - classification_loss: 0.2798 321/500 [==================>...........] - ETA: 1:00 - loss: 1.8526 - regression_loss: 1.5729 - classification_loss: 0.2797 322/500 [==================>...........] - ETA: 1:00 - loss: 1.8532 - regression_loss: 1.5736 - classification_loss: 0.2795 323/500 [==================>...........] - ETA: 1:00 - loss: 1.8527 - regression_loss: 1.5734 - classification_loss: 0.2793 324/500 [==================>...........] 
- ETA: 59s - loss: 1.8526 - regression_loss: 1.5732 - classification_loss: 0.2793  325/500 [==================>...........] - ETA: 59s - loss: 1.8540 - regression_loss: 1.5743 - classification_loss: 0.2796 326/500 [==================>...........] - ETA: 59s - loss: 1.8539 - regression_loss: 1.5743 - classification_loss: 0.2797 327/500 [==================>...........] - ETA: 58s - loss: 1.8528 - regression_loss: 1.5733 - classification_loss: 0.2795 328/500 [==================>...........] - ETA: 58s - loss: 1.8512 - regression_loss: 1.5721 - classification_loss: 0.2791 329/500 [==================>...........] - ETA: 58s - loss: 1.8492 - regression_loss: 1.5706 - classification_loss: 0.2785 330/500 [==================>...........] - ETA: 57s - loss: 1.8477 - regression_loss: 1.5695 - classification_loss: 0.2782 331/500 [==================>...........] - ETA: 57s - loss: 1.8477 - regression_loss: 1.5696 - classification_loss: 0.2782 332/500 [==================>...........] - ETA: 57s - loss: 1.8489 - regression_loss: 1.5705 - classification_loss: 0.2785 333/500 [==================>...........] - ETA: 56s - loss: 1.8497 - regression_loss: 1.5711 - classification_loss: 0.2786 334/500 [===================>..........] - ETA: 56s - loss: 1.8481 - regression_loss: 1.5698 - classification_loss: 0.2783 335/500 [===================>..........] - ETA: 56s - loss: 1.8484 - regression_loss: 1.5702 - classification_loss: 0.2782 336/500 [===================>..........] - ETA: 55s - loss: 1.8494 - regression_loss: 1.5710 - classification_loss: 0.2784 337/500 [===================>..........] - ETA: 55s - loss: 1.8499 - regression_loss: 1.5717 - classification_loss: 0.2783 338/500 [===================>..........] - ETA: 54s - loss: 1.8494 - regression_loss: 1.5712 - classification_loss: 0.2781 339/500 [===================>..........] - ETA: 54s - loss: 1.8491 - regression_loss: 1.5711 - classification_loss: 0.2780 340/500 [===================>..........] 
- ETA: 54s - loss: 1.8481 - regression_loss: 1.5704 - classification_loss: 0.2776 341/500 [===================>..........] - ETA: 53s - loss: 1.8458 - regression_loss: 1.5686 - classification_loss: 0.2772 342/500 [===================>..........] - ETA: 53s - loss: 1.8484 - regression_loss: 1.5706 - classification_loss: 0.2778 343/500 [===================>..........] - ETA: 53s - loss: 1.8524 - regression_loss: 1.5740 - classification_loss: 0.2784 344/500 [===================>..........] - ETA: 52s - loss: 1.8540 - regression_loss: 1.5752 - classification_loss: 0.2787 345/500 [===================>..........] - ETA: 52s - loss: 1.8548 - regression_loss: 1.5760 - classification_loss: 0.2788 346/500 [===================>..........] - ETA: 52s - loss: 1.8563 - regression_loss: 1.5777 - classification_loss: 0.2786 347/500 [===================>..........] - ETA: 51s - loss: 1.8559 - regression_loss: 1.5774 - classification_loss: 0.2785 348/500 [===================>..........] - ETA: 51s - loss: 1.8578 - regression_loss: 1.5789 - classification_loss: 0.2789 349/500 [===================>..........] - ETA: 51s - loss: 1.8581 - regression_loss: 1.5793 - classification_loss: 0.2789 350/500 [====================>.........] - ETA: 50s - loss: 1.8593 - regression_loss: 1.5803 - classification_loss: 0.2791 351/500 [====================>.........] - ETA: 50s - loss: 1.8596 - regression_loss: 1.5805 - classification_loss: 0.2791 352/500 [====================>.........] - ETA: 50s - loss: 1.8601 - regression_loss: 1.5807 - classification_loss: 0.2795 353/500 [====================>.........] - ETA: 49s - loss: 1.8601 - regression_loss: 1.5807 - classification_loss: 0.2794 354/500 [====================>.........] - ETA: 49s - loss: 1.8606 - regression_loss: 1.5810 - classification_loss: 0.2796 355/500 [====================>.........] - ETA: 49s - loss: 1.8600 - regression_loss: 1.5804 - classification_loss: 0.2796 356/500 [====================>.........] 
- ETA: 48s - loss: 1.8611 - regression_loss: 1.5815 - classification_loss: 0.2796 357/500 [====================>.........] - ETA: 48s - loss: 1.8613 - regression_loss: 1.5818 - classification_loss: 0.2795 358/500 [====================>.........] - ETA: 48s - loss: 1.8614 - regression_loss: 1.5818 - classification_loss: 0.2796 359/500 [====================>.........] - ETA: 47s - loss: 1.8621 - regression_loss: 1.5826 - classification_loss: 0.2795 360/500 [====================>.........] - ETA: 47s - loss: 1.8610 - regression_loss: 1.5816 - classification_loss: 0.2794 361/500 [====================>.........] - ETA: 47s - loss: 1.8619 - regression_loss: 1.5825 - classification_loss: 0.2794 362/500 [====================>.........] - ETA: 46s - loss: 1.8624 - regression_loss: 1.5831 - classification_loss: 0.2793 363/500 [====================>.........] - ETA: 46s - loss: 1.8619 - regression_loss: 1.5828 - classification_loss: 0.2790 364/500 [====================>.........] - ETA: 46s - loss: 1.8609 - regression_loss: 1.5821 - classification_loss: 0.2788 365/500 [====================>.........] - ETA: 45s - loss: 1.8611 - regression_loss: 1.5823 - classification_loss: 0.2788 366/500 [====================>.........] - ETA: 45s - loss: 1.8610 - regression_loss: 1.5819 - classification_loss: 0.2791 367/500 [=====================>........] - ETA: 45s - loss: 1.8606 - regression_loss: 1.5817 - classification_loss: 0.2788 368/500 [=====================>........] - ETA: 44s - loss: 1.8577 - regression_loss: 1.5792 - classification_loss: 0.2785 369/500 [=====================>........] - ETA: 44s - loss: 1.8566 - regression_loss: 1.5783 - classification_loss: 0.2783 370/500 [=====================>........] - ETA: 44s - loss: 1.8570 - regression_loss: 1.5787 - classification_loss: 0.2783 371/500 [=====================>........] - ETA: 43s - loss: 1.8572 - regression_loss: 1.5790 - classification_loss: 0.2782 372/500 [=====================>........] 
- ETA: 43s - loss: 1.8562 - regression_loss: 1.5780 - classification_loss: 0.2783 373/500 [=====================>........] - ETA: 43s - loss: 1.8556 - regression_loss: 1.5775 - classification_loss: 0.2781 374/500 [=====================>........] - ETA: 42s - loss: 1.8567 - regression_loss: 1.5785 - classification_loss: 0.2782 375/500 [=====================>........] - ETA: 42s - loss: 1.8582 - regression_loss: 1.5798 - classification_loss: 0.2784 376/500 [=====================>........] - ETA: 42s - loss: 1.8592 - regression_loss: 1.5806 - classification_loss: 0.2785 377/500 [=====================>........] - ETA: 41s - loss: 1.8595 - regression_loss: 1.5807 - classification_loss: 0.2788 378/500 [=====================>........] - ETA: 41s - loss: 1.8608 - regression_loss: 1.5818 - classification_loss: 0.2790 379/500 [=====================>........] - ETA: 41s - loss: 1.8597 - regression_loss: 1.5809 - classification_loss: 0.2788 380/500 [=====================>........] - ETA: 40s - loss: 1.8604 - regression_loss: 1.5815 - classification_loss: 0.2789 381/500 [=====================>........] - ETA: 40s - loss: 1.8599 - regression_loss: 1.5813 - classification_loss: 0.2786 382/500 [=====================>........] - ETA: 40s - loss: 1.8582 - regression_loss: 1.5800 - classification_loss: 0.2781 383/500 [=====================>........] - ETA: 39s - loss: 1.8589 - regression_loss: 1.5804 - classification_loss: 0.2786 384/500 [======================>.......] - ETA: 39s - loss: 1.8576 - regression_loss: 1.5794 - classification_loss: 0.2782 385/500 [======================>.......] - ETA: 39s - loss: 1.8577 - regression_loss: 1.5796 - classification_loss: 0.2781 386/500 [======================>.......] - ETA: 38s - loss: 1.8553 - regression_loss: 1.5774 - classification_loss: 0.2779 387/500 [======================>.......] - ETA: 38s - loss: 1.8555 - regression_loss: 1.5775 - classification_loss: 0.2780 388/500 [======================>.......] 
- ETA: 38s - loss: 1.8550 - regression_loss: 1.5770 - classification_loss: 0.2780 389/500 [======================>.......] - ETA: 37s - loss: 1.8534 - regression_loss: 1.5758 - classification_loss: 0.2776 390/500 [======================>.......] - ETA: 37s - loss: 1.8528 - regression_loss: 1.5753 - classification_loss: 0.2775 391/500 [======================>.......] - ETA: 37s - loss: 1.8529 - regression_loss: 1.5754 - classification_loss: 0.2775 392/500 [======================>.......] - ETA: 36s - loss: 1.8518 - regression_loss: 1.5746 - classification_loss: 0.2772 393/500 [======================>.......] - ETA: 36s - loss: 1.8516 - regression_loss: 1.5744 - classification_loss: 0.2772 394/500 [======================>.......] - ETA: 36s - loss: 1.8523 - regression_loss: 1.5750 - classification_loss: 0.2773 395/500 [======================>.......] - ETA: 35s - loss: 1.8562 - regression_loss: 1.5776 - classification_loss: 0.2786 396/500 [======================>.......] - ETA: 35s - loss: 1.8542 - regression_loss: 1.5758 - classification_loss: 0.2784 397/500 [======================>.......] - ETA: 35s - loss: 1.8530 - regression_loss: 1.5749 - classification_loss: 0.2781 398/500 [======================>.......] - ETA: 34s - loss: 1.8528 - regression_loss: 1.5749 - classification_loss: 0.2779 399/500 [======================>.......] - ETA: 34s - loss: 1.8539 - regression_loss: 1.5757 - classification_loss: 0.2782 400/500 [=======================>......] - ETA: 33s - loss: 1.8549 - regression_loss: 1.5765 - classification_loss: 0.2784 401/500 [=======================>......] - ETA: 33s - loss: 1.8543 - regression_loss: 1.5761 - classification_loss: 0.2782 402/500 [=======================>......] - ETA: 33s - loss: 1.8548 - regression_loss: 1.5766 - classification_loss: 0.2783 403/500 [=======================>......] - ETA: 32s - loss: 1.8554 - regression_loss: 1.5771 - classification_loss: 0.2783 404/500 [=======================>......] 
- ETA: 32s - loss: 1.8540 - regression_loss: 1.5761 - classification_loss: 0.2779 405/500 [=======================>......] - ETA: 32s - loss: 1.8542 - regression_loss: 1.5762 - classification_loss: 0.2780 406/500 [=======================>......] - ETA: 31s - loss: 1.8542 - regression_loss: 1.5763 - classification_loss: 0.2779 407/500 [=======================>......] - ETA: 31s - loss: 1.8546 - regression_loss: 1.5763 - classification_loss: 0.2783 408/500 [=======================>......] - ETA: 31s - loss: 1.8549 - regression_loss: 1.5766 - classification_loss: 0.2783 409/500 [=======================>......] - ETA: 30s - loss: 1.8540 - regression_loss: 1.5758 - classification_loss: 0.2782 410/500 [=======================>......] - ETA: 30s - loss: 1.8552 - regression_loss: 1.5767 - classification_loss: 0.2785 411/500 [=======================>......] - ETA: 30s - loss: 1.8586 - regression_loss: 1.5788 - classification_loss: 0.2798 412/500 [=======================>......] - ETA: 29s - loss: 1.8590 - regression_loss: 1.5792 - classification_loss: 0.2798 413/500 [=======================>......] - ETA: 29s - loss: 1.8589 - regression_loss: 1.5793 - classification_loss: 0.2796 414/500 [=======================>......] - ETA: 29s - loss: 1.8592 - regression_loss: 1.5796 - classification_loss: 0.2796 415/500 [=======================>......] - ETA: 28s - loss: 1.8607 - regression_loss: 1.5809 - classification_loss: 0.2798 416/500 [=======================>......] - ETA: 28s - loss: 1.8601 - regression_loss: 1.5805 - classification_loss: 0.2796 417/500 [========================>.....] - ETA: 28s - loss: 1.8604 - regression_loss: 1.5809 - classification_loss: 0.2795 418/500 [========================>.....] - ETA: 27s - loss: 1.8597 - regression_loss: 1.5803 - classification_loss: 0.2794 419/500 [========================>.....] - ETA: 27s - loss: 1.8592 - regression_loss: 1.5800 - classification_loss: 0.2793 420/500 [========================>.....] 
[Epoch 6 — per-batch progress for batches 420–499 truncated; running loss declined from 1.8605 (regression 1.5814, classification 0.2791) to 1.8392 (regression 1.5656, classification 0.2736) as the ETA counted down from 27s]
500/500 [==============================] - 170s 340ms/step - loss: 1.8397 - regression_loss: 1.5661 - classification_loss: 0.2736
1172 instances of class plum with average precision: 0.6719
mAP: 0.6719
Epoch 00006: saving model to ./training/snapshots/resnet101_pascal_06.h5
Epoch 7/150
[Epoch 7 — per-batch progress for batches 1–254 truncated; after a noisy start (loss 1.1722 at batch 1, peaking at 2.2268 at batch 2) the running loss settled near 1.77 (regression ≈ 1.51, classification ≈ 0.26) with the ETA falling from 2:45 to 1:24]
- ETA: 1:24 - loss: 1.7681 - regression_loss: 1.5089 - classification_loss: 0.2593 255/500 [==============>...............] - ETA: 1:23 - loss: 1.7688 - regression_loss: 1.5095 - classification_loss: 0.2593 256/500 [==============>...............] - ETA: 1:23 - loss: 1.7683 - regression_loss: 1.5092 - classification_loss: 0.2591 257/500 [==============>...............] - ETA: 1:23 - loss: 1.7684 - regression_loss: 1.5092 - classification_loss: 0.2591 258/500 [==============>...............] - ETA: 1:22 - loss: 1.7666 - regression_loss: 1.5078 - classification_loss: 0.2589 259/500 [==============>...............] - ETA: 1:22 - loss: 1.7674 - regression_loss: 1.5087 - classification_loss: 0.2586 260/500 [==============>...............] - ETA: 1:22 - loss: 1.7691 - regression_loss: 1.5101 - classification_loss: 0.2590 261/500 [==============>...............] - ETA: 1:21 - loss: 1.7689 - regression_loss: 1.5100 - classification_loss: 0.2589 262/500 [==============>...............] - ETA: 1:21 - loss: 1.7709 - regression_loss: 1.5115 - classification_loss: 0.2593 263/500 [==============>...............] - ETA: 1:21 - loss: 1.7693 - regression_loss: 1.5102 - classification_loss: 0.2591 264/500 [==============>...............] - ETA: 1:20 - loss: 1.7709 - regression_loss: 1.5115 - classification_loss: 0.2594 265/500 [==============>...............] - ETA: 1:20 - loss: 1.7714 - regression_loss: 1.5120 - classification_loss: 0.2594 266/500 [==============>...............] - ETA: 1:20 - loss: 1.7725 - regression_loss: 1.5130 - classification_loss: 0.2595 267/500 [===============>..............] - ETA: 1:19 - loss: 1.7733 - regression_loss: 1.5137 - classification_loss: 0.2596 268/500 [===============>..............] - ETA: 1:19 - loss: 1.7740 - regression_loss: 1.5144 - classification_loss: 0.2596 269/500 [===============>..............] - ETA: 1:19 - loss: 1.7752 - regression_loss: 1.5156 - classification_loss: 0.2595 270/500 [===============>..............] 
- ETA: 1:18 - loss: 1.7732 - regression_loss: 1.5138 - classification_loss: 0.2594 271/500 [===============>..............] - ETA: 1:18 - loss: 1.7725 - regression_loss: 1.5134 - classification_loss: 0.2592 272/500 [===============>..............] - ETA: 1:18 - loss: 1.7728 - regression_loss: 1.5136 - classification_loss: 0.2593 273/500 [===============>..............] - ETA: 1:17 - loss: 1.7724 - regression_loss: 1.5134 - classification_loss: 0.2589 274/500 [===============>..............] - ETA: 1:17 - loss: 1.7727 - regression_loss: 1.5139 - classification_loss: 0.2587 275/500 [===============>..............] - ETA: 1:17 - loss: 1.7762 - regression_loss: 1.5173 - classification_loss: 0.2589 276/500 [===============>..............] - ETA: 1:16 - loss: 1.7773 - regression_loss: 1.5183 - classification_loss: 0.2590 277/500 [===============>..............] - ETA: 1:16 - loss: 1.7765 - regression_loss: 1.5177 - classification_loss: 0.2587 278/500 [===============>..............] - ETA: 1:16 - loss: 1.7753 - regression_loss: 1.5168 - classification_loss: 0.2585 279/500 [===============>..............] - ETA: 1:15 - loss: 1.7763 - regression_loss: 1.5176 - classification_loss: 0.2588 280/500 [===============>..............] - ETA: 1:15 - loss: 1.7754 - regression_loss: 1.5170 - classification_loss: 0.2584 281/500 [===============>..............] - ETA: 1:14 - loss: 1.7755 - regression_loss: 1.5172 - classification_loss: 0.2583 282/500 [===============>..............] - ETA: 1:14 - loss: 1.7734 - regression_loss: 1.5154 - classification_loss: 0.2580 283/500 [===============>..............] - ETA: 1:14 - loss: 1.7754 - regression_loss: 1.5169 - classification_loss: 0.2584 284/500 [================>.............] - ETA: 1:13 - loss: 1.7755 - regression_loss: 1.5172 - classification_loss: 0.2583 285/500 [================>.............] - ETA: 1:13 - loss: 1.7748 - regression_loss: 1.5167 - classification_loss: 0.2581 286/500 [================>.............] 
- ETA: 1:13 - loss: 1.7730 - regression_loss: 1.5153 - classification_loss: 0.2577 287/500 [================>.............] - ETA: 1:12 - loss: 1.7725 - regression_loss: 1.5150 - classification_loss: 0.2575 288/500 [================>.............] - ETA: 1:12 - loss: 1.7707 - regression_loss: 1.5135 - classification_loss: 0.2572 289/500 [================>.............] - ETA: 1:12 - loss: 1.7696 - regression_loss: 1.5119 - classification_loss: 0.2577 290/500 [================>.............] - ETA: 1:11 - loss: 1.7704 - regression_loss: 1.5126 - classification_loss: 0.2578 291/500 [================>.............] - ETA: 1:11 - loss: 1.7688 - regression_loss: 1.5108 - classification_loss: 0.2580 292/500 [================>.............] - ETA: 1:11 - loss: 1.7704 - regression_loss: 1.5120 - classification_loss: 0.2583 293/500 [================>.............] - ETA: 1:10 - loss: 1.7729 - regression_loss: 1.5140 - classification_loss: 0.2589 294/500 [================>.............] - ETA: 1:10 - loss: 1.7762 - regression_loss: 1.5171 - classification_loss: 0.2591 295/500 [================>.............] - ETA: 1:10 - loss: 1.7780 - regression_loss: 1.5189 - classification_loss: 0.2591 296/500 [================>.............] - ETA: 1:09 - loss: 1.7791 - regression_loss: 1.5200 - classification_loss: 0.2591 297/500 [================>.............] - ETA: 1:09 - loss: 1.7791 - regression_loss: 1.5199 - classification_loss: 0.2592 298/500 [================>.............] - ETA: 1:09 - loss: 1.7762 - regression_loss: 1.5174 - classification_loss: 0.2588 299/500 [================>.............] - ETA: 1:08 - loss: 1.7796 - regression_loss: 1.5197 - classification_loss: 0.2599 300/500 [=================>............] - ETA: 1:08 - loss: 1.7807 - regression_loss: 1.5208 - classification_loss: 0.2599 301/500 [=================>............] - ETA: 1:08 - loss: 1.7807 - regression_loss: 1.5209 - classification_loss: 0.2598 302/500 [=================>............] 
- ETA: 1:07 - loss: 1.7780 - regression_loss: 1.5188 - classification_loss: 0.2592 303/500 [=================>............] - ETA: 1:07 - loss: 1.7783 - regression_loss: 1.5189 - classification_loss: 0.2594 304/500 [=================>............] - ETA: 1:07 - loss: 1.7777 - regression_loss: 1.5186 - classification_loss: 0.2591 305/500 [=================>............] - ETA: 1:06 - loss: 1.7769 - regression_loss: 1.5180 - classification_loss: 0.2589 306/500 [=================>............] - ETA: 1:06 - loss: 1.7775 - regression_loss: 1.5184 - classification_loss: 0.2591 307/500 [=================>............] - ETA: 1:06 - loss: 1.7772 - regression_loss: 1.5182 - classification_loss: 0.2590 308/500 [=================>............] - ETA: 1:05 - loss: 1.7749 - regression_loss: 1.5161 - classification_loss: 0.2588 309/500 [=================>............] - ETA: 1:05 - loss: 1.7731 - regression_loss: 1.5146 - classification_loss: 0.2585 310/500 [=================>............] - ETA: 1:05 - loss: 1.7754 - regression_loss: 1.5161 - classification_loss: 0.2594 311/500 [=================>............] - ETA: 1:04 - loss: 1.7740 - regression_loss: 1.5150 - classification_loss: 0.2590 312/500 [=================>............] - ETA: 1:04 - loss: 1.7771 - regression_loss: 1.5171 - classification_loss: 0.2600 313/500 [=================>............] - ETA: 1:04 - loss: 1.7776 - regression_loss: 1.5176 - classification_loss: 0.2600 314/500 [=================>............] - ETA: 1:03 - loss: 1.7789 - regression_loss: 1.5188 - classification_loss: 0.2601 315/500 [=================>............] - ETA: 1:03 - loss: 1.7787 - regression_loss: 1.5187 - classification_loss: 0.2600 316/500 [=================>............] - ETA: 1:02 - loss: 1.7777 - regression_loss: 1.5178 - classification_loss: 0.2598 317/500 [==================>...........] - ETA: 1:02 - loss: 1.7778 - regression_loss: 1.5180 - classification_loss: 0.2598 318/500 [==================>...........] 
- ETA: 1:02 - loss: 1.7779 - regression_loss: 1.5182 - classification_loss: 0.2597 319/500 [==================>...........] - ETA: 1:01 - loss: 1.7789 - regression_loss: 1.5191 - classification_loss: 0.2597 320/500 [==================>...........] - ETA: 1:01 - loss: 1.7758 - regression_loss: 1.5165 - classification_loss: 0.2593 321/500 [==================>...........] - ETA: 1:01 - loss: 1.7750 - regression_loss: 1.5157 - classification_loss: 0.2592 322/500 [==================>...........] - ETA: 1:00 - loss: 1.7750 - regression_loss: 1.5159 - classification_loss: 0.2591 323/500 [==================>...........] - ETA: 1:00 - loss: 1.7723 - regression_loss: 1.5136 - classification_loss: 0.2586 324/500 [==================>...........] - ETA: 1:00 - loss: 1.7728 - regression_loss: 1.5141 - classification_loss: 0.2587 325/500 [==================>...........] - ETA: 59s - loss: 1.7713 - regression_loss: 1.5130 - classification_loss: 0.2583  326/500 [==================>...........] - ETA: 59s - loss: 1.7711 - regression_loss: 1.5129 - classification_loss: 0.2583 327/500 [==================>...........] - ETA: 59s - loss: 1.7711 - regression_loss: 1.5129 - classification_loss: 0.2581 328/500 [==================>...........] - ETA: 58s - loss: 1.7703 - regression_loss: 1.5120 - classification_loss: 0.2583 329/500 [==================>...........] - ETA: 58s - loss: 1.7707 - regression_loss: 1.5123 - classification_loss: 0.2583 330/500 [==================>...........] - ETA: 58s - loss: 1.7721 - regression_loss: 1.5132 - classification_loss: 0.2590 331/500 [==================>...........] - ETA: 57s - loss: 1.7700 - regression_loss: 1.5115 - classification_loss: 0.2584 332/500 [==================>...........] - ETA: 57s - loss: 1.7702 - regression_loss: 1.5117 - classification_loss: 0.2585 333/500 [==================>...........] - ETA: 57s - loss: 1.7697 - regression_loss: 1.5113 - classification_loss: 0.2584 334/500 [===================>..........] 
- ETA: 56s - loss: 1.7698 - regression_loss: 1.5111 - classification_loss: 0.2587 335/500 [===================>..........] - ETA: 56s - loss: 1.7687 - regression_loss: 1.5102 - classification_loss: 0.2586 336/500 [===================>..........] - ETA: 56s - loss: 1.7689 - regression_loss: 1.5103 - classification_loss: 0.2586 337/500 [===================>..........] - ETA: 55s - loss: 1.7697 - regression_loss: 1.5110 - classification_loss: 0.2587 338/500 [===================>..........] - ETA: 55s - loss: 1.7678 - regression_loss: 1.5095 - classification_loss: 0.2583 339/500 [===================>..........] - ETA: 55s - loss: 1.7683 - regression_loss: 1.5100 - classification_loss: 0.2584 340/500 [===================>..........] - ETA: 54s - loss: 1.7682 - regression_loss: 1.5099 - classification_loss: 0.2583 341/500 [===================>..........] - ETA: 54s - loss: 1.7694 - regression_loss: 1.5111 - classification_loss: 0.2583 342/500 [===================>..........] - ETA: 54s - loss: 1.7692 - regression_loss: 1.5109 - classification_loss: 0.2583 343/500 [===================>..........] - ETA: 53s - loss: 1.7687 - regression_loss: 1.5107 - classification_loss: 0.2580 344/500 [===================>..........] - ETA: 53s - loss: 1.7678 - regression_loss: 1.5099 - classification_loss: 0.2578 345/500 [===================>..........] - ETA: 52s - loss: 1.7685 - regression_loss: 1.5107 - classification_loss: 0.2578 346/500 [===================>..........] - ETA: 52s - loss: 1.7690 - regression_loss: 1.5109 - classification_loss: 0.2581 347/500 [===================>..........] - ETA: 52s - loss: 1.7697 - regression_loss: 1.5115 - classification_loss: 0.2581 348/500 [===================>..........] - ETA: 51s - loss: 1.7679 - regression_loss: 1.5102 - classification_loss: 0.2577 349/500 [===================>..........] - ETA: 51s - loss: 1.7664 - regression_loss: 1.5090 - classification_loss: 0.2574 350/500 [====================>.........] 
- ETA: 51s - loss: 1.7651 - regression_loss: 1.5081 - classification_loss: 0.2570 351/500 [====================>.........] - ETA: 50s - loss: 1.7653 - regression_loss: 1.5083 - classification_loss: 0.2570 352/500 [====================>.........] - ETA: 50s - loss: 1.7644 - regression_loss: 1.5073 - classification_loss: 0.2571 353/500 [====================>.........] - ETA: 50s - loss: 1.7648 - regression_loss: 1.5076 - classification_loss: 0.2572 354/500 [====================>.........] - ETA: 49s - loss: 1.7652 - regression_loss: 1.5080 - classification_loss: 0.2572 355/500 [====================>.........] - ETA: 49s - loss: 1.7661 - regression_loss: 1.5087 - classification_loss: 0.2573 356/500 [====================>.........] - ETA: 49s - loss: 1.7660 - regression_loss: 1.5087 - classification_loss: 0.2573 357/500 [====================>.........] - ETA: 48s - loss: 1.7652 - regression_loss: 1.5079 - classification_loss: 0.2572 358/500 [====================>.........] - ETA: 48s - loss: 1.7640 - regression_loss: 1.5069 - classification_loss: 0.2571 359/500 [====================>.........] - ETA: 48s - loss: 1.7648 - regression_loss: 1.5075 - classification_loss: 0.2572 360/500 [====================>.........] - ETA: 47s - loss: 1.7643 - regression_loss: 1.5072 - classification_loss: 0.2571 361/500 [====================>.........] - ETA: 47s - loss: 1.7641 - regression_loss: 1.5071 - classification_loss: 0.2570 362/500 [====================>.........] - ETA: 47s - loss: 1.7649 - regression_loss: 1.5075 - classification_loss: 0.2575 363/500 [====================>.........] - ETA: 46s - loss: 1.7625 - regression_loss: 1.5055 - classification_loss: 0.2570 364/500 [====================>.........] - ETA: 46s - loss: 1.7657 - regression_loss: 1.5082 - classification_loss: 0.2575 365/500 [====================>.........] - ETA: 46s - loss: 1.7626 - regression_loss: 1.5055 - classification_loss: 0.2571 366/500 [====================>.........] 
- ETA: 45s - loss: 1.7625 - regression_loss: 1.5055 - classification_loss: 0.2570 367/500 [=====================>........] - ETA: 45s - loss: 1.7619 - regression_loss: 1.5050 - classification_loss: 0.2569 368/500 [=====================>........] - ETA: 45s - loss: 1.7621 - regression_loss: 1.5051 - classification_loss: 0.2570 369/500 [=====================>........] - ETA: 44s - loss: 1.7620 - regression_loss: 1.5049 - classification_loss: 0.2571 370/500 [=====================>........] - ETA: 44s - loss: 1.7625 - regression_loss: 1.5055 - classification_loss: 0.2570 371/500 [=====================>........] - ETA: 44s - loss: 1.7641 - regression_loss: 1.5067 - classification_loss: 0.2573 372/500 [=====================>........] - ETA: 43s - loss: 1.7646 - regression_loss: 1.5073 - classification_loss: 0.2572 373/500 [=====================>........] - ETA: 43s - loss: 1.7654 - regression_loss: 1.5079 - classification_loss: 0.2574 374/500 [=====================>........] - ETA: 43s - loss: 1.7663 - regression_loss: 1.5088 - classification_loss: 0.2575 375/500 [=====================>........] - ETA: 42s - loss: 1.7669 - regression_loss: 1.5093 - classification_loss: 0.2576 376/500 [=====================>........] - ETA: 42s - loss: 1.7674 - regression_loss: 1.5096 - classification_loss: 0.2578 377/500 [=====================>........] - ETA: 42s - loss: 1.7691 - regression_loss: 1.5111 - classification_loss: 0.2581 378/500 [=====================>........] - ETA: 41s - loss: 1.7693 - regression_loss: 1.5110 - classification_loss: 0.2582 379/500 [=====================>........] - ETA: 41s - loss: 1.7679 - regression_loss: 1.5099 - classification_loss: 0.2580 380/500 [=====================>........] - ETA: 41s - loss: 1.7690 - regression_loss: 1.5110 - classification_loss: 0.2580 381/500 [=====================>........] - ETA: 40s - loss: 1.7695 - regression_loss: 1.5117 - classification_loss: 0.2579 382/500 [=====================>........] 
- ETA: 40s - loss: 1.7678 - regression_loss: 1.5103 - classification_loss: 0.2576 383/500 [=====================>........] - ETA: 40s - loss: 1.7672 - regression_loss: 1.5098 - classification_loss: 0.2574 384/500 [======================>.......] - ETA: 39s - loss: 1.7657 - regression_loss: 1.5086 - classification_loss: 0.2571 385/500 [======================>.......] - ETA: 39s - loss: 1.7660 - regression_loss: 1.5088 - classification_loss: 0.2572 386/500 [======================>.......] - ETA: 38s - loss: 1.7644 - regression_loss: 1.5076 - classification_loss: 0.2569 387/500 [======================>.......] - ETA: 38s - loss: 1.7631 - regression_loss: 1.5065 - classification_loss: 0.2566 388/500 [======================>.......] - ETA: 38s - loss: 1.7638 - regression_loss: 1.5071 - classification_loss: 0.2567 389/500 [======================>.......] - ETA: 37s - loss: 1.7634 - regression_loss: 1.5067 - classification_loss: 0.2567 390/500 [======================>.......] - ETA: 37s - loss: 1.7618 - regression_loss: 1.5054 - classification_loss: 0.2565 391/500 [======================>.......] - ETA: 37s - loss: 1.7607 - regression_loss: 1.5045 - classification_loss: 0.2562 392/500 [======================>.......] - ETA: 36s - loss: 1.7615 - regression_loss: 1.5050 - classification_loss: 0.2565 393/500 [======================>.......] - ETA: 36s - loss: 1.7616 - regression_loss: 1.5051 - classification_loss: 0.2565 394/500 [======================>.......] - ETA: 36s - loss: 1.7612 - regression_loss: 1.5047 - classification_loss: 0.2564 395/500 [======================>.......] - ETA: 35s - loss: 1.7605 - regression_loss: 1.5041 - classification_loss: 0.2564 396/500 [======================>.......] - ETA: 35s - loss: 1.7596 - regression_loss: 1.5035 - classification_loss: 0.2561 397/500 [======================>.......] - ETA: 35s - loss: 1.7605 - regression_loss: 1.5043 - classification_loss: 0.2562 398/500 [======================>.......] 
- ETA: 34s - loss: 1.7606 - regression_loss: 1.5044 - classification_loss: 0.2562 399/500 [======================>.......] - ETA: 34s - loss: 1.7618 - regression_loss: 1.5055 - classification_loss: 0.2563 400/500 [=======================>......] - ETA: 34s - loss: 1.7625 - regression_loss: 1.5059 - classification_loss: 0.2565 401/500 [=======================>......] - ETA: 33s - loss: 1.7610 - regression_loss: 1.5047 - classification_loss: 0.2562 402/500 [=======================>......] - ETA: 33s - loss: 1.7603 - regression_loss: 1.5042 - classification_loss: 0.2561 403/500 [=======================>......] - ETA: 33s - loss: 1.7621 - regression_loss: 1.5056 - classification_loss: 0.2565 404/500 [=======================>......] - ETA: 32s - loss: 1.7618 - regression_loss: 1.5053 - classification_loss: 0.2564 405/500 [=======================>......] - ETA: 32s - loss: 1.7627 - regression_loss: 1.5061 - classification_loss: 0.2566 406/500 [=======================>......] - ETA: 32s - loss: 1.7618 - regression_loss: 1.5053 - classification_loss: 0.2565 407/500 [=======================>......] - ETA: 31s - loss: 1.7619 - regression_loss: 1.5055 - classification_loss: 0.2565 408/500 [=======================>......] - ETA: 31s - loss: 1.7606 - regression_loss: 1.5044 - classification_loss: 0.2562 409/500 [=======================>......] - ETA: 31s - loss: 1.7605 - regression_loss: 1.5044 - classification_loss: 0.2561 410/500 [=======================>......] - ETA: 30s - loss: 1.7615 - regression_loss: 1.5052 - classification_loss: 0.2563 411/500 [=======================>......] - ETA: 30s - loss: 1.7589 - regression_loss: 1.5030 - classification_loss: 0.2559 412/500 [=======================>......] - ETA: 30s - loss: 1.7593 - regression_loss: 1.5033 - classification_loss: 0.2560 413/500 [=======================>......] - ETA: 29s - loss: 1.7597 - regression_loss: 1.5036 - classification_loss: 0.2561 414/500 [=======================>......] 
- ETA: 29s - loss: 1.7601 - regression_loss: 1.5040 - classification_loss: 0.2561 415/500 [=======================>......] - ETA: 29s - loss: 1.7584 - regression_loss: 1.5025 - classification_loss: 0.2558 416/500 [=======================>......] - ETA: 28s - loss: 1.7586 - regression_loss: 1.5024 - classification_loss: 0.2562 417/500 [========================>.....] - ETA: 28s - loss: 1.7572 - regression_loss: 1.5011 - classification_loss: 0.2561 418/500 [========================>.....] - ETA: 28s - loss: 1.7556 - regression_loss: 1.4997 - classification_loss: 0.2558 419/500 [========================>.....] - ETA: 27s - loss: 1.7554 - regression_loss: 1.4995 - classification_loss: 0.2559 420/500 [========================>.....] - ETA: 27s - loss: 1.7547 - regression_loss: 1.4989 - classification_loss: 0.2558 421/500 [========================>.....] - ETA: 26s - loss: 1.7549 - regression_loss: 1.4991 - classification_loss: 0.2558 422/500 [========================>.....] - ETA: 26s - loss: 1.7548 - regression_loss: 1.4991 - classification_loss: 0.2556 423/500 [========================>.....] - ETA: 26s - loss: 1.7548 - regression_loss: 1.4993 - classification_loss: 0.2555 424/500 [========================>.....] - ETA: 25s - loss: 1.7537 - regression_loss: 1.4984 - classification_loss: 0.2553 425/500 [========================>.....] - ETA: 25s - loss: 1.7539 - regression_loss: 1.4986 - classification_loss: 0.2553 426/500 [========================>.....] - ETA: 25s - loss: 1.7547 - regression_loss: 1.4993 - classification_loss: 0.2554 427/500 [========================>.....] - ETA: 24s - loss: 1.7559 - regression_loss: 1.5003 - classification_loss: 0.2556 428/500 [========================>.....] - ETA: 24s - loss: 1.7558 - regression_loss: 1.5002 - classification_loss: 0.2556 429/500 [========================>.....] - ETA: 24s - loss: 1.7564 - regression_loss: 1.5004 - classification_loss: 0.2560 430/500 [========================>.....] 
- ETA: 23s - loss: 1.7564 - regression_loss: 1.5004 - classification_loss: 0.2560 431/500 [========================>.....] - ETA: 23s - loss: 1.7558 - regression_loss: 1.5000 - classification_loss: 0.2558 432/500 [========================>.....] - ETA: 23s - loss: 1.7561 - regression_loss: 1.5004 - classification_loss: 0.2557 433/500 [========================>.....] - ETA: 22s - loss: 1.7560 - regression_loss: 1.5003 - classification_loss: 0.2557 434/500 [=========================>....] - ETA: 22s - loss: 1.7557 - regression_loss: 1.5000 - classification_loss: 0.2557 435/500 [=========================>....] - ETA: 22s - loss: 1.7565 - regression_loss: 1.5008 - classification_loss: 0.2557 436/500 [=========================>....] - ETA: 21s - loss: 1.7569 - regression_loss: 1.5011 - classification_loss: 0.2558 437/500 [=========================>....] - ETA: 21s - loss: 1.7545 - regression_loss: 1.4990 - classification_loss: 0.2555 438/500 [=========================>....] - ETA: 21s - loss: 1.7536 - regression_loss: 1.4983 - classification_loss: 0.2553 439/500 [=========================>....] - ETA: 20s - loss: 1.7519 - regression_loss: 1.4968 - classification_loss: 0.2551 440/500 [=========================>....] - ETA: 20s - loss: 1.7518 - regression_loss: 1.4969 - classification_loss: 0.2550 441/500 [=========================>....] - ETA: 20s - loss: 1.7517 - regression_loss: 1.4967 - classification_loss: 0.2550 442/500 [=========================>....] - ETA: 19s - loss: 1.7514 - regression_loss: 1.4964 - classification_loss: 0.2550 443/500 [=========================>....] - ETA: 19s - loss: 1.7516 - regression_loss: 1.4968 - classification_loss: 0.2548 444/500 [=========================>....] - ETA: 19s - loss: 1.7518 - regression_loss: 1.4970 - classification_loss: 0.2548 445/500 [=========================>....] - ETA: 18s - loss: 1.7518 - regression_loss: 1.4970 - classification_loss: 0.2548 446/500 [=========================>....] 
- ETA: 18s - loss: 1.7537 - regression_loss: 1.4987 - classification_loss: 0.2550 447/500 [=========================>....] - ETA: 18s - loss: 1.7546 - regression_loss: 1.4996 - classification_loss: 0.2550 448/500 [=========================>....] - ETA: 17s - loss: 1.7544 - regression_loss: 1.4994 - classification_loss: 0.2550 449/500 [=========================>....] - ETA: 17s - loss: 1.7550 - regression_loss: 1.4999 - classification_loss: 0.2552 450/500 [==========================>...] - ETA: 17s - loss: 1.7557 - regression_loss: 1.5004 - classification_loss: 0.2553 451/500 [==========================>...] - ETA: 16s - loss: 1.7556 - regression_loss: 1.5001 - classification_loss: 0.2554 452/500 [==========================>...] - ETA: 16s - loss: 1.7556 - regression_loss: 1.5002 - classification_loss: 0.2554 453/500 [==========================>...] - ETA: 16s - loss: 1.7548 - regression_loss: 1.4995 - classification_loss: 0.2553 454/500 [==========================>...] - ETA: 15s - loss: 1.7546 - regression_loss: 1.4994 - classification_loss: 0.2552 455/500 [==========================>...] - ETA: 15s - loss: 1.7557 - regression_loss: 1.5004 - classification_loss: 0.2553 456/500 [==========================>...] - ETA: 15s - loss: 1.7556 - regression_loss: 1.5004 - classification_loss: 0.2552 457/500 [==========================>...] - ETA: 14s - loss: 1.7553 - regression_loss: 1.5003 - classification_loss: 0.2551 458/500 [==========================>...] - ETA: 14s - loss: 1.7561 - regression_loss: 1.5009 - classification_loss: 0.2552 459/500 [==========================>...] - ETA: 14s - loss: 1.7558 - regression_loss: 1.5004 - classification_loss: 0.2554 460/500 [==========================>...] - ETA: 13s - loss: 1.7556 - regression_loss: 1.5004 - classification_loss: 0.2552 461/500 [==========================>...] - ETA: 13s - loss: 1.7535 - regression_loss: 1.4986 - classification_loss: 0.2550 462/500 [==========================>...] 
[per-batch progress output trimmed: batches 462-499 of epoch 7, loss steady near 1.75 (regression ~1.50, classification ~0.26)]
500/500 [==============================] - 171s 342ms/step - loss: 1.7581 - regression_loss: 1.5021 - classification_loss: 0.2559
1172 instances of class plum with average precision: 0.6935
mAP: 0.6935
Epoch 00007: saving model to ./training/snapshots/resnet101_pascal_07.h5
Epoch 8/150
[per-batch progress output trimmed: batches 1-297 of 500, ETA starting around 2:46; loss fluctuated between ~1.62 and ~1.90 over the first 30 batches, then settled near 1.71 (regression ~1.47, classification ~0.24) by batch 297]
- ETA: 1:09 - loss: 1.7085 - regression_loss: 1.4666 - classification_loss: 0.2418 298/500 [================>.............] - ETA: 1:08 - loss: 1.7082 - regression_loss: 1.4665 - classification_loss: 0.2417 299/500 [================>.............] - ETA: 1:08 - loss: 1.7078 - regression_loss: 1.4662 - classification_loss: 0.2416 300/500 [=================>............] - ETA: 1:08 - loss: 1.7077 - regression_loss: 1.4661 - classification_loss: 0.2416 301/500 [=================>............] - ETA: 1:07 - loss: 1.7081 - regression_loss: 1.4665 - classification_loss: 0.2416 302/500 [=================>............] - ETA: 1:07 - loss: 1.7069 - regression_loss: 1.4657 - classification_loss: 0.2412 303/500 [=================>............] - ETA: 1:07 - loss: 1.7063 - regression_loss: 1.4652 - classification_loss: 0.2411 304/500 [=================>............] - ETA: 1:06 - loss: 1.7060 - regression_loss: 1.4650 - classification_loss: 0.2410 305/500 [=================>............] - ETA: 1:06 - loss: 1.7048 - regression_loss: 1.4640 - classification_loss: 0.2408 306/500 [=================>............] - ETA: 1:06 - loss: 1.7062 - regression_loss: 1.4653 - classification_loss: 0.2408 307/500 [=================>............] - ETA: 1:05 - loss: 1.7056 - regression_loss: 1.4648 - classification_loss: 0.2408 308/500 [=================>............] - ETA: 1:05 - loss: 1.7046 - regression_loss: 1.4640 - classification_loss: 0.2406 309/500 [=================>............] - ETA: 1:05 - loss: 1.7040 - regression_loss: 1.4635 - classification_loss: 0.2405 310/500 [=================>............] - ETA: 1:04 - loss: 1.7057 - regression_loss: 1.4651 - classification_loss: 0.2406 311/500 [=================>............] - ETA: 1:04 - loss: 1.7059 - regression_loss: 1.4653 - classification_loss: 0.2406 312/500 [=================>............] - ETA: 1:04 - loss: 1.7055 - regression_loss: 1.4648 - classification_loss: 0.2407 313/500 [=================>............] 
- ETA: 1:03 - loss: 1.7067 - regression_loss: 1.4659 - classification_loss: 0.2408 314/500 [=================>............] - ETA: 1:03 - loss: 1.7087 - regression_loss: 1.4672 - classification_loss: 0.2414 315/500 [=================>............] - ETA: 1:03 - loss: 1.7081 - regression_loss: 1.4668 - classification_loss: 0.2413 316/500 [=================>............] - ETA: 1:02 - loss: 1.7102 - regression_loss: 1.4686 - classification_loss: 0.2416 317/500 [==================>...........] - ETA: 1:02 - loss: 1.7097 - regression_loss: 1.4682 - classification_loss: 0.2415 318/500 [==================>...........] - ETA: 1:02 - loss: 1.7102 - regression_loss: 1.4683 - classification_loss: 0.2419 319/500 [==================>...........] - ETA: 1:01 - loss: 1.7111 - regression_loss: 1.4691 - classification_loss: 0.2420 320/500 [==================>...........] - ETA: 1:01 - loss: 1.7099 - regression_loss: 1.4681 - classification_loss: 0.2418 321/500 [==================>...........] - ETA: 1:01 - loss: 1.7100 - regression_loss: 1.4682 - classification_loss: 0.2418 322/500 [==================>...........] - ETA: 1:00 - loss: 1.7086 - regression_loss: 1.4670 - classification_loss: 0.2416 323/500 [==================>...........] - ETA: 1:00 - loss: 1.7081 - regression_loss: 1.4662 - classification_loss: 0.2419 324/500 [==================>...........] - ETA: 1:00 - loss: 1.7065 - regression_loss: 1.4648 - classification_loss: 0.2417 325/500 [==================>...........] - ETA: 59s - loss: 1.7060 - regression_loss: 1.4644 - classification_loss: 0.2416  326/500 [==================>...........] - ETA: 59s - loss: 1.7042 - regression_loss: 1.4629 - classification_loss: 0.2413 327/500 [==================>...........] - ETA: 59s - loss: 1.7037 - regression_loss: 1.4625 - classification_loss: 0.2412 328/500 [==================>...........] - ETA: 58s - loss: 1.7054 - regression_loss: 1.4640 - classification_loss: 0.2415 329/500 [==================>...........] 
- ETA: 58s - loss: 1.7052 - regression_loss: 1.4636 - classification_loss: 0.2416 330/500 [==================>...........] - ETA: 58s - loss: 1.7045 - regression_loss: 1.4630 - classification_loss: 0.2415 331/500 [==================>...........] - ETA: 57s - loss: 1.7044 - regression_loss: 1.4629 - classification_loss: 0.2415 332/500 [==================>...........] - ETA: 57s - loss: 1.7028 - regression_loss: 1.4613 - classification_loss: 0.2415 333/500 [==================>...........] - ETA: 56s - loss: 1.7031 - regression_loss: 1.4615 - classification_loss: 0.2416 334/500 [===================>..........] - ETA: 56s - loss: 1.7022 - regression_loss: 1.4607 - classification_loss: 0.2415 335/500 [===================>..........] - ETA: 56s - loss: 1.7002 - regression_loss: 1.4588 - classification_loss: 0.2414 336/500 [===================>..........] - ETA: 55s - loss: 1.7021 - regression_loss: 1.4606 - classification_loss: 0.2415 337/500 [===================>..........] - ETA: 55s - loss: 1.7026 - regression_loss: 1.4610 - classification_loss: 0.2416 338/500 [===================>..........] - ETA: 55s - loss: 1.7007 - regression_loss: 1.4593 - classification_loss: 0.2413 339/500 [===================>..........] - ETA: 54s - loss: 1.7007 - regression_loss: 1.4595 - classification_loss: 0.2412 340/500 [===================>..........] - ETA: 54s - loss: 1.7007 - regression_loss: 1.4595 - classification_loss: 0.2411 341/500 [===================>..........] - ETA: 54s - loss: 1.7017 - regression_loss: 1.4604 - classification_loss: 0.2413 342/500 [===================>..........] - ETA: 53s - loss: 1.7017 - regression_loss: 1.4601 - classification_loss: 0.2416 343/500 [===================>..........] - ETA: 53s - loss: 1.7016 - regression_loss: 1.4600 - classification_loss: 0.2417 344/500 [===================>..........] - ETA: 53s - loss: 1.7005 - regression_loss: 1.4591 - classification_loss: 0.2414 345/500 [===================>..........] 
- ETA: 52s - loss: 1.7005 - regression_loss: 1.4593 - classification_loss: 0.2413 346/500 [===================>..........] - ETA: 52s - loss: 1.7011 - regression_loss: 1.4598 - classification_loss: 0.2413 347/500 [===================>..........] - ETA: 52s - loss: 1.7004 - regression_loss: 1.4593 - classification_loss: 0.2412 348/500 [===================>..........] - ETA: 51s - loss: 1.7012 - regression_loss: 1.4600 - classification_loss: 0.2412 349/500 [===================>..........] - ETA: 51s - loss: 1.6993 - regression_loss: 1.4585 - classification_loss: 0.2408 350/500 [====================>.........] - ETA: 51s - loss: 1.6984 - regression_loss: 1.4577 - classification_loss: 0.2407 351/500 [====================>.........] - ETA: 50s - loss: 1.6998 - regression_loss: 1.4588 - classification_loss: 0.2411 352/500 [====================>.........] - ETA: 50s - loss: 1.7000 - regression_loss: 1.4587 - classification_loss: 0.2413 353/500 [====================>.........] - ETA: 50s - loss: 1.7066 - regression_loss: 1.4637 - classification_loss: 0.2429 354/500 [====================>.........] - ETA: 49s - loss: 1.7047 - regression_loss: 1.4620 - classification_loss: 0.2427 355/500 [====================>.........] - ETA: 49s - loss: 1.7049 - regression_loss: 1.4622 - classification_loss: 0.2427 356/500 [====================>.........] - ETA: 49s - loss: 1.7052 - regression_loss: 1.4625 - classification_loss: 0.2428 357/500 [====================>.........] - ETA: 48s - loss: 1.7038 - regression_loss: 1.4613 - classification_loss: 0.2425 358/500 [====================>.........] - ETA: 48s - loss: 1.7021 - regression_loss: 1.4599 - classification_loss: 0.2422 359/500 [====================>.........] - ETA: 48s - loss: 1.7026 - regression_loss: 1.4603 - classification_loss: 0.2423 360/500 [====================>.........] - ETA: 47s - loss: 1.7032 - regression_loss: 1.4609 - classification_loss: 0.2424 361/500 [====================>.........] 
- ETA: 47s - loss: 1.7042 - regression_loss: 1.4617 - classification_loss: 0.2425 362/500 [====================>.........] - ETA: 47s - loss: 1.7032 - regression_loss: 1.4608 - classification_loss: 0.2424 363/500 [====================>.........] - ETA: 46s - loss: 1.7039 - regression_loss: 1.4615 - classification_loss: 0.2424 364/500 [====================>.........] - ETA: 46s - loss: 1.7037 - regression_loss: 1.4613 - classification_loss: 0.2424 365/500 [====================>.........] - ETA: 46s - loss: 1.7032 - regression_loss: 1.4607 - classification_loss: 0.2425 366/500 [====================>.........] - ETA: 45s - loss: 1.7037 - regression_loss: 1.4610 - classification_loss: 0.2426 367/500 [=====================>........] - ETA: 45s - loss: 1.7044 - regression_loss: 1.4619 - classification_loss: 0.2425 368/500 [=====================>........] - ETA: 45s - loss: 1.7048 - regression_loss: 1.4622 - classification_loss: 0.2425 369/500 [=====================>........] - ETA: 44s - loss: 1.7039 - regression_loss: 1.4614 - classification_loss: 0.2424 370/500 [=====================>........] - ETA: 44s - loss: 1.7060 - regression_loss: 1.4632 - classification_loss: 0.2427 371/500 [=====================>........] - ETA: 44s - loss: 1.7053 - regression_loss: 1.4626 - classification_loss: 0.2427 372/500 [=====================>........] - ETA: 43s - loss: 1.7062 - regression_loss: 1.4634 - classification_loss: 0.2428 373/500 [=====================>........] - ETA: 43s - loss: 1.7062 - regression_loss: 1.4634 - classification_loss: 0.2428 374/500 [=====================>........] - ETA: 43s - loss: 1.7062 - regression_loss: 1.4635 - classification_loss: 0.2428 375/500 [=====================>........] - ETA: 42s - loss: 1.7057 - regression_loss: 1.4629 - classification_loss: 0.2427 376/500 [=====================>........] - ETA: 42s - loss: 1.7050 - regression_loss: 1.4624 - classification_loss: 0.2426 377/500 [=====================>........] 
- ETA: 41s - loss: 1.7047 - regression_loss: 1.4622 - classification_loss: 0.2425 378/500 [=====================>........] - ETA: 41s - loss: 1.7045 - regression_loss: 1.4620 - classification_loss: 0.2424 379/500 [=====================>........] - ETA: 41s - loss: 1.7037 - regression_loss: 1.4613 - classification_loss: 0.2424 380/500 [=====================>........] - ETA: 40s - loss: 1.7034 - regression_loss: 1.4611 - classification_loss: 0.2423 381/500 [=====================>........] - ETA: 40s - loss: 1.7015 - regression_loss: 1.4595 - classification_loss: 0.2420 382/500 [=====================>........] - ETA: 40s - loss: 1.7010 - regression_loss: 1.4591 - classification_loss: 0.2419 383/500 [=====================>........] - ETA: 39s - loss: 1.7020 - regression_loss: 1.4601 - classification_loss: 0.2420 384/500 [======================>.......] - ETA: 39s - loss: 1.7012 - regression_loss: 1.4595 - classification_loss: 0.2416 385/500 [======================>.......] - ETA: 39s - loss: 1.7002 - regression_loss: 1.4585 - classification_loss: 0.2417 386/500 [======================>.......] - ETA: 38s - loss: 1.6986 - regression_loss: 1.4570 - classification_loss: 0.2415 387/500 [======================>.......] - ETA: 38s - loss: 1.7007 - regression_loss: 1.4589 - classification_loss: 0.2418 388/500 [======================>.......] - ETA: 38s - loss: 1.7006 - regression_loss: 1.4589 - classification_loss: 0.2417 389/500 [======================>.......] - ETA: 37s - loss: 1.6999 - regression_loss: 1.4584 - classification_loss: 0.2415 390/500 [======================>.......] - ETA: 37s - loss: 1.7012 - regression_loss: 1.4595 - classification_loss: 0.2417 391/500 [======================>.......] - ETA: 37s - loss: 1.7010 - regression_loss: 1.4594 - classification_loss: 0.2415 392/500 [======================>.......] - ETA: 36s - loss: 1.7018 - regression_loss: 1.4602 - classification_loss: 0.2416 393/500 [======================>.......] 
- ETA: 36s - loss: 1.7020 - regression_loss: 1.4604 - classification_loss: 0.2415 394/500 [======================>.......] - ETA: 36s - loss: 1.7031 - regression_loss: 1.4613 - classification_loss: 0.2419 395/500 [======================>.......] - ETA: 35s - loss: 1.7047 - regression_loss: 1.4627 - classification_loss: 0.2420 396/500 [======================>.......] - ETA: 35s - loss: 1.7065 - regression_loss: 1.4643 - classification_loss: 0.2423 397/500 [======================>.......] - ETA: 35s - loss: 1.7050 - regression_loss: 1.4630 - classification_loss: 0.2421 398/500 [======================>.......] - ETA: 34s - loss: 1.7056 - regression_loss: 1.4635 - classification_loss: 0.2421 399/500 [======================>.......] - ETA: 34s - loss: 1.7060 - regression_loss: 1.4639 - classification_loss: 0.2421 400/500 [=======================>......] - ETA: 34s - loss: 1.7077 - regression_loss: 1.4654 - classification_loss: 0.2423 401/500 [=======================>......] - ETA: 33s - loss: 1.7087 - regression_loss: 1.4662 - classification_loss: 0.2424 402/500 [=======================>......] - ETA: 33s - loss: 1.7089 - regression_loss: 1.4666 - classification_loss: 0.2423 403/500 [=======================>......] - ETA: 33s - loss: 1.7063 - regression_loss: 1.4644 - classification_loss: 0.2419 404/500 [=======================>......] - ETA: 32s - loss: 1.7065 - regression_loss: 1.4646 - classification_loss: 0.2419 405/500 [=======================>......] - ETA: 32s - loss: 1.7071 - regression_loss: 1.4649 - classification_loss: 0.2422 406/500 [=======================>......] - ETA: 32s - loss: 1.7081 - regression_loss: 1.4660 - classification_loss: 0.2421 407/500 [=======================>......] - ETA: 31s - loss: 1.7076 - regression_loss: 1.4657 - classification_loss: 0.2419 408/500 [=======================>......] - ETA: 31s - loss: 1.7079 - regression_loss: 1.4661 - classification_loss: 0.2419 409/500 [=======================>......] 
- ETA: 31s - loss: 1.7074 - regression_loss: 1.4656 - classification_loss: 0.2418 410/500 [=======================>......] - ETA: 30s - loss: 1.7069 - regression_loss: 1.4652 - classification_loss: 0.2417 411/500 [=======================>......] - ETA: 30s - loss: 1.7053 - regression_loss: 1.4638 - classification_loss: 0.2415 412/500 [=======================>......] - ETA: 30s - loss: 1.7063 - regression_loss: 1.4646 - classification_loss: 0.2417 413/500 [=======================>......] - ETA: 29s - loss: 1.7073 - regression_loss: 1.4655 - classification_loss: 0.2417 414/500 [=======================>......] - ETA: 29s - loss: 1.7055 - regression_loss: 1.4641 - classification_loss: 0.2414 415/500 [=======================>......] - ETA: 28s - loss: 1.7066 - regression_loss: 1.4651 - classification_loss: 0.2414 416/500 [=======================>......] - ETA: 28s - loss: 1.7064 - regression_loss: 1.4649 - classification_loss: 0.2415 417/500 [========================>.....] - ETA: 28s - loss: 1.7052 - regression_loss: 1.4639 - classification_loss: 0.2413 418/500 [========================>.....] - ETA: 27s - loss: 1.7037 - regression_loss: 1.4626 - classification_loss: 0.2411 419/500 [========================>.....] - ETA: 27s - loss: 1.7037 - regression_loss: 1.4626 - classification_loss: 0.2411 420/500 [========================>.....] - ETA: 27s - loss: 1.7051 - regression_loss: 1.4637 - classification_loss: 0.2414 421/500 [========================>.....] - ETA: 26s - loss: 1.7054 - regression_loss: 1.4640 - classification_loss: 0.2414 422/500 [========================>.....] - ETA: 26s - loss: 1.7060 - regression_loss: 1.4645 - classification_loss: 0.2415 423/500 [========================>.....] - ETA: 26s - loss: 1.7067 - regression_loss: 1.4650 - classification_loss: 0.2417 424/500 [========================>.....] - ETA: 25s - loss: 1.7059 - regression_loss: 1.4643 - classification_loss: 0.2416 425/500 [========================>.....] 
- ETA: 25s - loss: 1.7064 - regression_loss: 1.4646 - classification_loss: 0.2418 426/500 [========================>.....] - ETA: 25s - loss: 1.7064 - regression_loss: 1.4646 - classification_loss: 0.2418 427/500 [========================>.....] - ETA: 24s - loss: 1.7065 - regression_loss: 1.4646 - classification_loss: 0.2418 428/500 [========================>.....] - ETA: 24s - loss: 1.7070 - regression_loss: 1.4650 - classification_loss: 0.2420 429/500 [========================>.....] - ETA: 24s - loss: 1.7066 - regression_loss: 1.4646 - classification_loss: 0.2420 430/500 [========================>.....] - ETA: 23s - loss: 1.7057 - regression_loss: 1.4640 - classification_loss: 0.2417 431/500 [========================>.....] - ETA: 23s - loss: 1.7034 - regression_loss: 1.4620 - classification_loss: 0.2414 432/500 [========================>.....] - ETA: 23s - loss: 1.7022 - regression_loss: 1.4609 - classification_loss: 0.2413 433/500 [========================>.....] - ETA: 22s - loss: 1.7032 - regression_loss: 1.4618 - classification_loss: 0.2415 434/500 [=========================>....] - ETA: 22s - loss: 1.7020 - regression_loss: 1.4603 - classification_loss: 0.2416 435/500 [=========================>....] - ETA: 22s - loss: 1.7017 - regression_loss: 1.4602 - classification_loss: 0.2415 436/500 [=========================>....] - ETA: 21s - loss: 1.6993 - regression_loss: 1.4581 - classification_loss: 0.2412 437/500 [=========================>....] - ETA: 21s - loss: 1.6978 - regression_loss: 1.4567 - classification_loss: 0.2411 438/500 [=========================>....] - ETA: 21s - loss: 1.6957 - regression_loss: 1.4547 - classification_loss: 0.2410 439/500 [=========================>....] - ETA: 20s - loss: 1.6952 - regression_loss: 1.4544 - classification_loss: 0.2408 440/500 [=========================>....] - ETA: 20s - loss: 1.6963 - regression_loss: 1.4551 - classification_loss: 0.2412 441/500 [=========================>....] 
- ETA: 20s - loss: 1.6998 - regression_loss: 1.4579 - classification_loss: 0.2419 442/500 [=========================>....] - ETA: 19s - loss: 1.7000 - regression_loss: 1.4581 - classification_loss: 0.2419 443/500 [=========================>....] - ETA: 19s - loss: 1.7015 - regression_loss: 1.4593 - classification_loss: 0.2422 444/500 [=========================>....] - ETA: 19s - loss: 1.7013 - regression_loss: 1.4590 - classification_loss: 0.2423 445/500 [=========================>....] - ETA: 18s - loss: 1.7010 - regression_loss: 1.4588 - classification_loss: 0.2423 446/500 [=========================>....] - ETA: 18s - loss: 1.7019 - regression_loss: 1.4593 - classification_loss: 0.2425 447/500 [=========================>....] - ETA: 18s - loss: 1.6998 - regression_loss: 1.4575 - classification_loss: 0.2423 448/500 [=========================>....] - ETA: 17s - loss: 1.6996 - regression_loss: 1.4573 - classification_loss: 0.2423 449/500 [=========================>....] - ETA: 17s - loss: 1.6989 - regression_loss: 1.4568 - classification_loss: 0.2421 450/500 [==========================>...] - ETA: 17s - loss: 1.6984 - regression_loss: 1.4564 - classification_loss: 0.2420 451/500 [==========================>...] - ETA: 16s - loss: 1.6986 - regression_loss: 1.4566 - classification_loss: 0.2420 452/500 [==========================>...] - ETA: 16s - loss: 1.6985 - regression_loss: 1.4565 - classification_loss: 0.2420 453/500 [==========================>...] - ETA: 16s - loss: 1.6985 - regression_loss: 1.4561 - classification_loss: 0.2424 454/500 [==========================>...] - ETA: 15s - loss: 1.6961 - regression_loss: 1.4540 - classification_loss: 0.2421 455/500 [==========================>...] - ETA: 15s - loss: 1.6950 - regression_loss: 1.4532 - classification_loss: 0.2418 456/500 [==========================>...] - ETA: 15s - loss: 1.6953 - regression_loss: 1.4534 - classification_loss: 0.2419 457/500 [==========================>...] 
- ETA: 14s - loss: 1.6933 - regression_loss: 1.4518 - classification_loss: 0.2415 458/500 [==========================>...] - ETA: 14s - loss: 1.6935 - regression_loss: 1.4519 - classification_loss: 0.2416 459/500 [==========================>...] - ETA: 13s - loss: 1.6930 - regression_loss: 1.4516 - classification_loss: 0.2414 460/500 [==========================>...] - ETA: 13s - loss: 1.6931 - regression_loss: 1.4517 - classification_loss: 0.2414 461/500 [==========================>...] - ETA: 13s - loss: 1.6930 - regression_loss: 1.4517 - classification_loss: 0.2413 462/500 [==========================>...] - ETA: 12s - loss: 1.6931 - regression_loss: 1.4517 - classification_loss: 0.2414 463/500 [==========================>...] - ETA: 12s - loss: 1.6924 - regression_loss: 1.4512 - classification_loss: 0.2411 464/500 [==========================>...] - ETA: 12s - loss: 1.6956 - regression_loss: 1.4540 - classification_loss: 0.2416 465/500 [==========================>...] - ETA: 11s - loss: 1.6964 - regression_loss: 1.4548 - classification_loss: 0.2416 466/500 [==========================>...] - ETA: 11s - loss: 1.6941 - regression_loss: 1.4528 - classification_loss: 0.2413 467/500 [===========================>..] - ETA: 11s - loss: 1.6942 - regression_loss: 1.4531 - classification_loss: 0.2411 468/500 [===========================>..] - ETA: 10s - loss: 1.6950 - regression_loss: 1.4538 - classification_loss: 0.2412 469/500 [===========================>..] - ETA: 10s - loss: 1.6960 - regression_loss: 1.4546 - classification_loss: 0.2414 470/500 [===========================>..] - ETA: 10s - loss: 1.6942 - regression_loss: 1.4531 - classification_loss: 0.2411 471/500 [===========================>..] - ETA: 9s - loss: 1.6944 - regression_loss: 1.4532 - classification_loss: 0.2412  472/500 [===========================>..] - ETA: 9s - loss: 1.6930 - regression_loss: 1.4520 - classification_loss: 0.2410 473/500 [===========================>..] 
- ETA: 9s - loss: 1.6930 - regression_loss: 1.4518 - classification_loss: 0.2412 474/500 [===========================>..] - ETA: 8s - loss: 1.6936 - regression_loss: 1.4523 - classification_loss: 0.2413 475/500 [===========================>..] - ETA: 8s - loss: 1.6936 - regression_loss: 1.4523 - classification_loss: 0.2413 476/500 [===========================>..] - ETA: 8s - loss: 1.6937 - regression_loss: 1.4523 - classification_loss: 0.2415 477/500 [===========================>..] - ETA: 7s - loss: 1.6942 - regression_loss: 1.4526 - classification_loss: 0.2415 478/500 [===========================>..] - ETA: 7s - loss: 1.6933 - regression_loss: 1.4518 - classification_loss: 0.2415 479/500 [===========================>..] - ETA: 7s - loss: 1.6932 - regression_loss: 1.4518 - classification_loss: 0.2414 480/500 [===========================>..] - ETA: 6s - loss: 1.6931 - regression_loss: 1.4518 - classification_loss: 0.2414 481/500 [===========================>..] - ETA: 6s - loss: 1.6928 - regression_loss: 1.4515 - classification_loss: 0.2413 482/500 [===========================>..] - ETA: 6s - loss: 1.6930 - regression_loss: 1.4517 - classification_loss: 0.2413 483/500 [===========================>..] - ETA: 5s - loss: 1.6936 - regression_loss: 1.4523 - classification_loss: 0.2412 484/500 [============================>.] - ETA: 5s - loss: 1.6938 - regression_loss: 1.4526 - classification_loss: 0.2412 485/500 [============================>.] - ETA: 5s - loss: 1.6938 - regression_loss: 1.4526 - classification_loss: 0.2412 486/500 [============================>.] - ETA: 4s - loss: 1.6946 - regression_loss: 1.4527 - classification_loss: 0.2419 487/500 [============================>.] - ETA: 4s - loss: 1.6945 - regression_loss: 1.4527 - classification_loss: 0.2418 488/500 [============================>.] - ETA: 4s - loss: 1.6941 - regression_loss: 1.4524 - classification_loss: 0.2417 489/500 [============================>.] 
- ETA: 3s - loss: 1.6941 - regression_loss: 1.4524 - classification_loss: 0.2417 490/500 [============================>.] - ETA: 3s - loss: 1.6931 - regression_loss: 1.4517 - classification_loss: 0.2414 491/500 [============================>.] - ETA: 3s - loss: 1.6933 - regression_loss: 1.4519 - classification_loss: 0.2415 492/500 [============================>.] - ETA: 2s - loss: 1.6946 - regression_loss: 1.4530 - classification_loss: 0.2416 493/500 [============================>.] - ETA: 2s - loss: 1.6947 - regression_loss: 1.4531 - classification_loss: 0.2416 494/500 [============================>.] - ETA: 2s - loss: 1.6953 - regression_loss: 1.4536 - classification_loss: 0.2417 495/500 [============================>.] - ETA: 1s - loss: 1.6948 - regression_loss: 1.4532 - classification_loss: 0.2416 496/500 [============================>.] - ETA: 1s - loss: 1.6936 - regression_loss: 1.4522 - classification_loss: 0.2413 497/500 [============================>.] - ETA: 1s - loss: 1.6935 - regression_loss: 1.4522 - classification_loss: 0.2413 498/500 [============================>.] - ETA: 0s - loss: 1.6940 - regression_loss: 1.4526 - classification_loss: 0.2414 499/500 [============================>.] - ETA: 0s - loss: 1.6940 - regression_loss: 1.4523 - classification_loss: 0.2416 500/500 [==============================] - 171s 341ms/step - loss: 1.6941 - regression_loss: 1.4525 - classification_loss: 0.2416 1172 instances of class plum with average precision: 0.7076 mAP: 0.7076 Epoch 00008: saving model to ./training/snapshots/resnet101_pascal_08.h5 Epoch 9/150 1/500 [..............................] - ETA: 2:43 - loss: 1.8252 - regression_loss: 1.5614 - classification_loss: 0.2639 2/500 [..............................] - ETA: 2:48 - loss: 1.7307 - regression_loss: 1.4843 - classification_loss: 0.2464 3/500 [..............................] - ETA: 2:51 - loss: 1.9115 - regression_loss: 1.6373 - classification_loss: 0.2743 4/500 [..............................] 
  5/500 [..............................] - ETA: 2:51 - loss: 1.5975 - regression_loss: 1.3843 - classification_loss: 0.2132
 20/500 [>.............................] - ETA: 2:43 - loss: 1.6225 - regression_loss: 1.4013 - classification_loss: 0.2212
 36/500 [=>............................] - ETA: 2:38 - loss: 1.6325 - regression_loss: 1.4063 - classification_loss: 0.2262
 52/500 [==>...........................] - ETA: 2:32 - loss: 1.6133 - regression_loss: 1.3807 - classification_loss: 0.2326
 68/500 [===>..........................] - ETA: 2:26 - loss: 1.6530 - regression_loss: 1.4110 - classification_loss: 0.2420
 84/500 [====>.........................] - ETA: 2:21 - loss: 1.6656 - regression_loss: 1.4268 - classification_loss: 0.2387
100/500 [=====>........................] - ETA: 2:15 - loss: 1.6349 - regression_loss: 1.4010 - classification_loss: 0.2338
116/500 [======>.......................] - ETA: 2:10 - loss: 1.6324 - regression_loss: 1.4012 - classification_loss: 0.2312
132/500 [======>.......................] - ETA: 2:05 - loss: 1.6112 - regression_loss: 1.3825 - classification_loss: 0.2287
148/500 [=======>......................] - ETA: 1:59 - loss: 1.6044 - regression_loss: 1.3769 - classification_loss: 0.2275
164/500 [========>.....................] - ETA: 1:54 - loss: 1.6240 - regression_loss: 1.3953 - classification_loss: 0.2287
180/500 [=========>....................] - ETA: 1:48 - loss: 1.6347 - regression_loss: 1.4043 - classification_loss: 0.2304
196/500 [==========>...................] - ETA: 1:43 - loss: 1.6278 - regression_loss: 1.3991 - classification_loss: 0.2286
212/500 [===========>..................] - ETA: 1:37 - loss: 1.6249 - regression_loss: 1.3969 - classification_loss: 0.2280
228/500 [============>.................] - ETA: 1:31 - loss: 1.6225 - regression_loss: 1.3953 - classification_loss: 0.2272
244/500 [=============>................] - ETA: 1:26 - loss: 1.6194 - regression_loss: 1.3931 - classification_loss: 0.2263
260/500 [==============>...............] - ETA: 1:21 - loss: 1.6231 - regression_loss: 1.3966 - classification_loss: 0.2265
276/500 [===============>..............] - ETA: 1:15 - loss: 1.6266 - regression_loss: 1.4003 - classification_loss: 0.2263
292/500 [================>.............] - ETA: 1:10 - loss: 1.6408 - regression_loss: 1.4100 - classification_loss: 0.2309
308/500 [=================>............] - ETA: 1:05 - loss: 1.6444 - regression_loss: 1.4129 - classification_loss: 0.2315
324/500 [==================>...........] - ETA: 59s - loss: 1.6439 - regression_loss: 1.4124 - classification_loss: 0.2315
340/500 [===================>..........]
- ETA: 54s - loss: 1.6382 - regression_loss: 1.4074 - classification_loss: 0.2307 341/500 [===================>..........] - ETA: 53s - loss: 1.6392 - regression_loss: 1.4081 - classification_loss: 0.2311 342/500 [===================>..........] - ETA: 53s - loss: 1.6395 - regression_loss: 1.4082 - classification_loss: 0.2314 343/500 [===================>..........] - ETA: 53s - loss: 1.6409 - regression_loss: 1.4093 - classification_loss: 0.2316 344/500 [===================>..........] - ETA: 52s - loss: 1.6409 - regression_loss: 1.4094 - classification_loss: 0.2315 345/500 [===================>..........] - ETA: 52s - loss: 1.6390 - regression_loss: 1.4079 - classification_loss: 0.2312 346/500 [===================>..........] - ETA: 52s - loss: 1.6375 - regression_loss: 1.4065 - classification_loss: 0.2310 347/500 [===================>..........] - ETA: 51s - loss: 1.6396 - regression_loss: 1.4082 - classification_loss: 0.2314 348/500 [===================>..........] - ETA: 51s - loss: 1.6390 - regression_loss: 1.4076 - classification_loss: 0.2313 349/500 [===================>..........] - ETA: 51s - loss: 1.6376 - regression_loss: 1.4066 - classification_loss: 0.2310 350/500 [====================>.........] - ETA: 50s - loss: 1.6364 - regression_loss: 1.4056 - classification_loss: 0.2308 351/500 [====================>.........] - ETA: 50s - loss: 1.6362 - regression_loss: 1.4055 - classification_loss: 0.2307 352/500 [====================>.........] - ETA: 50s - loss: 1.6355 - regression_loss: 1.4050 - classification_loss: 0.2305 353/500 [====================>.........] - ETA: 49s - loss: 1.6367 - regression_loss: 1.4061 - classification_loss: 0.2306 354/500 [====================>.........] - ETA: 49s - loss: 1.6362 - regression_loss: 1.4056 - classification_loss: 0.2306 355/500 [====================>.........] - ETA: 49s - loss: 1.6374 - regression_loss: 1.4066 - classification_loss: 0.2308 356/500 [====================>.........] 
- ETA: 48s - loss: 1.6367 - regression_loss: 1.4060 - classification_loss: 0.2307 357/500 [====================>.........] - ETA: 48s - loss: 1.6357 - regression_loss: 1.4053 - classification_loss: 0.2304 358/500 [====================>.........] - ETA: 48s - loss: 1.6353 - regression_loss: 1.4050 - classification_loss: 0.2303 359/500 [====================>.........] - ETA: 47s - loss: 1.6367 - regression_loss: 1.4061 - classification_loss: 0.2306 360/500 [====================>.........] - ETA: 47s - loss: 1.6363 - regression_loss: 1.4058 - classification_loss: 0.2306 361/500 [====================>.........] - ETA: 47s - loss: 1.6363 - regression_loss: 1.4057 - classification_loss: 0.2306 362/500 [====================>.........] - ETA: 46s - loss: 1.6359 - regression_loss: 1.4054 - classification_loss: 0.2305 363/500 [====================>.........] - ETA: 46s - loss: 1.6349 - regression_loss: 1.4047 - classification_loss: 0.2302 364/500 [====================>.........] - ETA: 46s - loss: 1.6357 - regression_loss: 1.4054 - classification_loss: 0.2303 365/500 [====================>.........] - ETA: 45s - loss: 1.6361 - regression_loss: 1.4060 - classification_loss: 0.2302 366/500 [====================>.........] - ETA: 45s - loss: 1.6364 - regression_loss: 1.4062 - classification_loss: 0.2302 367/500 [=====================>........] - ETA: 45s - loss: 1.6352 - regression_loss: 1.4052 - classification_loss: 0.2300 368/500 [=====================>........] - ETA: 44s - loss: 1.6354 - regression_loss: 1.4053 - classification_loss: 0.2301 369/500 [=====================>........] - ETA: 44s - loss: 1.6353 - regression_loss: 1.4053 - classification_loss: 0.2300 370/500 [=====================>........] - ETA: 44s - loss: 1.6350 - regression_loss: 1.4050 - classification_loss: 0.2300 371/500 [=====================>........] - ETA: 43s - loss: 1.6356 - regression_loss: 1.4054 - classification_loss: 0.2302 372/500 [=====================>........] 
- ETA: 43s - loss: 1.6347 - regression_loss: 1.4046 - classification_loss: 0.2300 373/500 [=====================>........] - ETA: 43s - loss: 1.6339 - regression_loss: 1.4040 - classification_loss: 0.2299 374/500 [=====================>........] - ETA: 42s - loss: 1.6337 - regression_loss: 1.4039 - classification_loss: 0.2298 375/500 [=====================>........] - ETA: 42s - loss: 1.6340 - regression_loss: 1.4043 - classification_loss: 0.2297 376/500 [=====================>........] - ETA: 42s - loss: 1.6343 - regression_loss: 1.4046 - classification_loss: 0.2297 377/500 [=====================>........] - ETA: 41s - loss: 1.6352 - regression_loss: 1.4055 - classification_loss: 0.2298 378/500 [=====================>........] - ETA: 41s - loss: 1.6342 - regression_loss: 1.4046 - classification_loss: 0.2296 379/500 [=====================>........] - ETA: 41s - loss: 1.6339 - regression_loss: 1.4045 - classification_loss: 0.2294 380/500 [=====================>........] - ETA: 40s - loss: 1.6343 - regression_loss: 1.4049 - classification_loss: 0.2294 381/500 [=====================>........] - ETA: 40s - loss: 1.6318 - regression_loss: 1.4028 - classification_loss: 0.2290 382/500 [=====================>........] - ETA: 40s - loss: 1.6314 - regression_loss: 1.4024 - classification_loss: 0.2290 383/500 [=====================>........] - ETA: 39s - loss: 1.6330 - regression_loss: 1.4040 - classification_loss: 0.2290 384/500 [======================>.......] - ETA: 39s - loss: 1.6326 - regression_loss: 1.4038 - classification_loss: 0.2288 385/500 [======================>.......] - ETA: 39s - loss: 1.6334 - regression_loss: 1.4045 - classification_loss: 0.2289 386/500 [======================>.......] - ETA: 38s - loss: 1.6346 - regression_loss: 1.4052 - classification_loss: 0.2294 387/500 [======================>.......] - ETA: 38s - loss: 1.6347 - regression_loss: 1.4053 - classification_loss: 0.2294 388/500 [======================>.......] 
- ETA: 38s - loss: 1.6330 - regression_loss: 1.4039 - classification_loss: 0.2291 389/500 [======================>.......] - ETA: 37s - loss: 1.6317 - regression_loss: 1.4029 - classification_loss: 0.2289 390/500 [======================>.......] - ETA: 37s - loss: 1.6304 - regression_loss: 1.4018 - classification_loss: 0.2286 391/500 [======================>.......] - ETA: 37s - loss: 1.6303 - regression_loss: 1.4017 - classification_loss: 0.2286 392/500 [======================>.......] - ETA: 36s - loss: 1.6312 - regression_loss: 1.4026 - classification_loss: 0.2287 393/500 [======================>.......] - ETA: 36s - loss: 1.6313 - regression_loss: 1.4027 - classification_loss: 0.2287 394/500 [======================>.......] - ETA: 35s - loss: 1.6311 - regression_loss: 1.4022 - classification_loss: 0.2289 395/500 [======================>.......] - ETA: 35s - loss: 1.6315 - regression_loss: 1.4027 - classification_loss: 0.2288 396/500 [======================>.......] - ETA: 35s - loss: 1.6318 - regression_loss: 1.4030 - classification_loss: 0.2288 397/500 [======================>.......] - ETA: 34s - loss: 1.6307 - regression_loss: 1.4020 - classification_loss: 0.2287 398/500 [======================>.......] - ETA: 34s - loss: 1.6302 - regression_loss: 1.4017 - classification_loss: 0.2285 399/500 [======================>.......] - ETA: 34s - loss: 1.6314 - regression_loss: 1.4026 - classification_loss: 0.2287 400/500 [=======================>......] - ETA: 33s - loss: 1.6314 - regression_loss: 1.4027 - classification_loss: 0.2287 401/500 [=======================>......] - ETA: 33s - loss: 1.6300 - regression_loss: 1.4013 - classification_loss: 0.2287 402/500 [=======================>......] - ETA: 33s - loss: 1.6299 - regression_loss: 1.4013 - classification_loss: 0.2286 403/500 [=======================>......] - ETA: 32s - loss: 1.6293 - regression_loss: 1.4007 - classification_loss: 0.2286 404/500 [=======================>......] 
- ETA: 32s - loss: 1.6272 - regression_loss: 1.3988 - classification_loss: 0.2283 405/500 [=======================>......] - ETA: 32s - loss: 1.6257 - regression_loss: 1.3976 - classification_loss: 0.2281 406/500 [=======================>......] - ETA: 31s - loss: 1.6277 - regression_loss: 1.3994 - classification_loss: 0.2283 407/500 [=======================>......] - ETA: 31s - loss: 1.6278 - regression_loss: 1.3996 - classification_loss: 0.2282 408/500 [=======================>......] - ETA: 31s - loss: 1.6289 - regression_loss: 1.4008 - classification_loss: 0.2282 409/500 [=======================>......] - ETA: 30s - loss: 1.6294 - regression_loss: 1.4012 - classification_loss: 0.2282 410/500 [=======================>......] - ETA: 30s - loss: 1.6298 - regression_loss: 1.4016 - classification_loss: 0.2281 411/500 [=======================>......] - ETA: 30s - loss: 1.6291 - regression_loss: 1.4009 - classification_loss: 0.2281 412/500 [=======================>......] - ETA: 29s - loss: 1.6292 - regression_loss: 1.4011 - classification_loss: 0.2281 413/500 [=======================>......] - ETA: 29s - loss: 1.6279 - regression_loss: 1.4000 - classification_loss: 0.2279 414/500 [=======================>......] - ETA: 29s - loss: 1.6260 - regression_loss: 1.3983 - classification_loss: 0.2276 415/500 [=======================>......] - ETA: 28s - loss: 1.6272 - regression_loss: 1.3993 - classification_loss: 0.2279 416/500 [=======================>......] - ETA: 28s - loss: 1.6255 - regression_loss: 1.3979 - classification_loss: 0.2276 417/500 [========================>.....] - ETA: 28s - loss: 1.6235 - regression_loss: 1.3959 - classification_loss: 0.2275 418/500 [========================>.....] - ETA: 27s - loss: 1.6242 - regression_loss: 1.3963 - classification_loss: 0.2279 419/500 [========================>.....] - ETA: 27s - loss: 1.6251 - regression_loss: 1.3970 - classification_loss: 0.2280 420/500 [========================>.....] 
- ETA: 27s - loss: 1.6252 - regression_loss: 1.3972 - classification_loss: 0.2280 421/500 [========================>.....] - ETA: 26s - loss: 1.6268 - regression_loss: 1.3986 - classification_loss: 0.2283 422/500 [========================>.....] - ETA: 26s - loss: 1.6254 - regression_loss: 1.3975 - classification_loss: 0.2280 423/500 [========================>.....] - ETA: 26s - loss: 1.6252 - regression_loss: 1.3974 - classification_loss: 0.2278 424/500 [========================>.....] - ETA: 25s - loss: 1.6257 - regression_loss: 1.3980 - classification_loss: 0.2277 425/500 [========================>.....] - ETA: 25s - loss: 1.6257 - regression_loss: 1.3980 - classification_loss: 0.2276 426/500 [========================>.....] - ETA: 25s - loss: 1.6254 - regression_loss: 1.3978 - classification_loss: 0.2275 427/500 [========================>.....] - ETA: 24s - loss: 1.6245 - regression_loss: 1.3971 - classification_loss: 0.2274 428/500 [========================>.....] - ETA: 24s - loss: 1.6241 - regression_loss: 1.3966 - classification_loss: 0.2275 429/500 [========================>.....] - ETA: 24s - loss: 1.6242 - regression_loss: 1.3967 - classification_loss: 0.2275 430/500 [========================>.....] - ETA: 23s - loss: 1.6226 - regression_loss: 1.3952 - classification_loss: 0.2274 431/500 [========================>.....] - ETA: 23s - loss: 1.6224 - regression_loss: 1.3951 - classification_loss: 0.2274 432/500 [========================>.....] - ETA: 23s - loss: 1.6224 - regression_loss: 1.3951 - classification_loss: 0.2273 433/500 [========================>.....] - ETA: 22s - loss: 1.6228 - regression_loss: 1.3956 - classification_loss: 0.2272 434/500 [=========================>....] - ETA: 22s - loss: 1.6229 - regression_loss: 1.3957 - classification_loss: 0.2272 435/500 [=========================>....] - ETA: 22s - loss: 1.6223 - regression_loss: 1.3951 - classification_loss: 0.2271 436/500 [=========================>....] 
- ETA: 21s - loss: 1.6234 - regression_loss: 1.3961 - classification_loss: 0.2273 437/500 [=========================>....] - ETA: 21s - loss: 1.6234 - regression_loss: 1.3961 - classification_loss: 0.2273 438/500 [=========================>....] - ETA: 21s - loss: 1.6233 - regression_loss: 1.3961 - classification_loss: 0.2271 439/500 [=========================>....] - ETA: 20s - loss: 1.6233 - regression_loss: 1.3963 - classification_loss: 0.2270 440/500 [=========================>....] - ETA: 20s - loss: 1.6233 - regression_loss: 1.3963 - classification_loss: 0.2271 441/500 [=========================>....] - ETA: 20s - loss: 1.6231 - regression_loss: 1.3961 - classification_loss: 0.2271 442/500 [=========================>....] - ETA: 19s - loss: 1.6228 - regression_loss: 1.3958 - classification_loss: 0.2270 443/500 [=========================>....] - ETA: 19s - loss: 1.6226 - regression_loss: 1.3956 - classification_loss: 0.2271 444/500 [=========================>....] - ETA: 19s - loss: 1.6228 - regression_loss: 1.3958 - classification_loss: 0.2271 445/500 [=========================>....] - ETA: 18s - loss: 1.6232 - regression_loss: 1.3962 - classification_loss: 0.2270 446/500 [=========================>....] - ETA: 18s - loss: 1.6232 - regression_loss: 1.3961 - classification_loss: 0.2270 447/500 [=========================>....] - ETA: 17s - loss: 1.6235 - regression_loss: 1.3965 - classification_loss: 0.2270 448/500 [=========================>....] - ETA: 17s - loss: 1.6237 - regression_loss: 1.3967 - classification_loss: 0.2270 449/500 [=========================>....] - ETA: 17s - loss: 1.6245 - regression_loss: 1.3974 - classification_loss: 0.2272 450/500 [==========================>...] - ETA: 16s - loss: 1.6233 - regression_loss: 1.3964 - classification_loss: 0.2269 451/500 [==========================>...] - ETA: 16s - loss: 1.6234 - regression_loss: 1.3965 - classification_loss: 0.2269 452/500 [==========================>...] 
- ETA: 16s - loss: 1.6225 - regression_loss: 1.3959 - classification_loss: 0.2267 453/500 [==========================>...] - ETA: 15s - loss: 1.6226 - regression_loss: 1.3960 - classification_loss: 0.2266 454/500 [==========================>...] - ETA: 15s - loss: 1.6200 - regression_loss: 1.3938 - classification_loss: 0.2263 455/500 [==========================>...] - ETA: 15s - loss: 1.6205 - regression_loss: 1.3942 - classification_loss: 0.2263 456/500 [==========================>...] - ETA: 14s - loss: 1.6206 - regression_loss: 1.3944 - classification_loss: 0.2263 457/500 [==========================>...] - ETA: 14s - loss: 1.6205 - regression_loss: 1.3942 - classification_loss: 0.2262 458/500 [==========================>...] - ETA: 14s - loss: 1.6194 - regression_loss: 1.3933 - classification_loss: 0.2261 459/500 [==========================>...] - ETA: 13s - loss: 1.6190 - regression_loss: 1.3930 - classification_loss: 0.2260 460/500 [==========================>...] - ETA: 13s - loss: 1.6203 - regression_loss: 1.3941 - classification_loss: 0.2262 461/500 [==========================>...] - ETA: 13s - loss: 1.6207 - regression_loss: 1.3946 - classification_loss: 0.2261 462/500 [==========================>...] - ETA: 12s - loss: 1.6207 - regression_loss: 1.3944 - classification_loss: 0.2263 463/500 [==========================>...] - ETA: 12s - loss: 1.6215 - regression_loss: 1.3951 - classification_loss: 0.2264 464/500 [==========================>...] - ETA: 12s - loss: 1.6209 - regression_loss: 1.3946 - classification_loss: 0.2263 465/500 [==========================>...] - ETA: 11s - loss: 1.6208 - regression_loss: 1.3946 - classification_loss: 0.2262 466/500 [==========================>...] - ETA: 11s - loss: 1.6189 - regression_loss: 1.3929 - classification_loss: 0.2260 467/500 [===========================>..] - ETA: 11s - loss: 1.6197 - regression_loss: 1.3934 - classification_loss: 0.2262 468/500 [===========================>..] 
- ETA: 10s - loss: 1.6179 - regression_loss: 1.3920 - classification_loss: 0.2260 469/500 [===========================>..] - ETA: 10s - loss: 1.6191 - regression_loss: 1.3930 - classification_loss: 0.2261 470/500 [===========================>..] - ETA: 10s - loss: 1.6193 - regression_loss: 1.3932 - classification_loss: 0.2261 471/500 [===========================>..] - ETA: 9s - loss: 1.6179 - regression_loss: 1.3920 - classification_loss: 0.2259  472/500 [===========================>..] - ETA: 9s - loss: 1.6171 - regression_loss: 1.3914 - classification_loss: 0.2257 473/500 [===========================>..] - ETA: 9s - loss: 1.6181 - regression_loss: 1.3924 - classification_loss: 0.2257 474/500 [===========================>..] - ETA: 8s - loss: 1.6181 - regression_loss: 1.3924 - classification_loss: 0.2257 475/500 [===========================>..] - ETA: 8s - loss: 1.6178 - regression_loss: 1.3921 - classification_loss: 0.2257 476/500 [===========================>..] - ETA: 8s - loss: 1.6187 - regression_loss: 1.3928 - classification_loss: 0.2259 477/500 [===========================>..] - ETA: 7s - loss: 1.6180 - regression_loss: 1.3921 - classification_loss: 0.2258 478/500 [===========================>..] - ETA: 7s - loss: 1.6192 - regression_loss: 1.3932 - classification_loss: 0.2259 479/500 [===========================>..] - ETA: 7s - loss: 1.6196 - regression_loss: 1.3935 - classification_loss: 0.2260 480/500 [===========================>..] - ETA: 6s - loss: 1.6203 - regression_loss: 1.3941 - classification_loss: 0.2261 481/500 [===========================>..] - ETA: 6s - loss: 1.6206 - regression_loss: 1.3944 - classification_loss: 0.2262 482/500 [===========================>..] - ETA: 6s - loss: 1.6192 - regression_loss: 1.3932 - classification_loss: 0.2260 483/500 [===========================>..] - ETA: 5s - loss: 1.6205 - regression_loss: 1.3946 - classification_loss: 0.2259 484/500 [============================>.] 
- ETA: 5s - loss: 1.6203 - regression_loss: 1.3944 - classification_loss: 0.2260 485/500 [============================>.] - ETA: 5s - loss: 1.6204 - regression_loss: 1.3946 - classification_loss: 0.2259 486/500 [============================>.] - ETA: 4s - loss: 1.6198 - regression_loss: 1.3940 - classification_loss: 0.2258 487/500 [============================>.] - ETA: 4s - loss: 1.6211 - regression_loss: 1.3952 - classification_loss: 0.2259 488/500 [============================>.] - ETA: 4s - loss: 1.6209 - regression_loss: 1.3950 - classification_loss: 0.2259 489/500 [============================>.] - ETA: 3s - loss: 1.6206 - regression_loss: 1.3948 - classification_loss: 0.2258 490/500 [============================>.] - ETA: 3s - loss: 1.6200 - regression_loss: 1.3942 - classification_loss: 0.2258 491/500 [============================>.] - ETA: 3s - loss: 1.6208 - regression_loss: 1.3949 - classification_loss: 0.2259 492/500 [============================>.] - ETA: 2s - loss: 1.6215 - regression_loss: 1.3955 - classification_loss: 0.2260 493/500 [============================>.] - ETA: 2s - loss: 1.6216 - regression_loss: 1.3956 - classification_loss: 0.2261 494/500 [============================>.] - ETA: 2s - loss: 1.6213 - regression_loss: 1.3952 - classification_loss: 0.2261 495/500 [============================>.] - ETA: 1s - loss: 1.6218 - regression_loss: 1.3956 - classification_loss: 0.2262 496/500 [============================>.] - ETA: 1s - loss: 1.6227 - regression_loss: 1.3963 - classification_loss: 0.2264 497/500 [============================>.] - ETA: 1s - loss: 1.6230 - regression_loss: 1.3965 - classification_loss: 0.2265 498/500 [============================>.] - ETA: 0s - loss: 1.6232 - regression_loss: 1.3967 - classification_loss: 0.2265 499/500 [============================>.] 
- ETA: 0s - loss: 1.6243 - regression_loss: 1.3979 - classification_loss: 0.2265
500/500 [==============================] - 170s 339ms/step - loss: 1.6240 - regression_loss: 1.3975 - classification_loss: 0.2265
1172 instances of class plum with average precision: 0.6316
mAP: 0.6316
Epoch 00009: saving model to ./training/snapshots/resnet101_pascal_09.h5
Epoch 10/150
[... per-step progress-bar updates for steps 1-13 of 500 elided; the running loss rises from 0.8123 to 1.4821 as the epoch average stabilizes ...]
14/500 [..............................]
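The total in each progress line is the sum of the two loss heads reported next to it (in keras-retinanet these are typically a smooth-L1 box-regression loss and a focal classification loss). A quick sanity check in plain Python, using the epoch-9 end-of-epoch values from the log above:

```python
# Epoch-9 final values copied from the log above.
loss = 1.6240
regression_loss = 1.3975      # box-regression head (smooth L1 in keras-retinanet)
classification_loss = 0.2265  # classification head (focal loss in keras-retinanet)

# The printed total should equal the sum of the two heads,
# up to rounding of the displayed 4-decimal values.
assert abs(loss - (regression_loss + classification_loss)) < 1e-3
print("total loss = regression_loss + classification_loss")
```

This decomposition holds for every step line in the log, which makes it easy to spot whether a loss spike comes from the regression or the classification head.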
- ETA: 2:49 - loss: 1.4991 - regression_loss: 1.2665 - classification_loss: 0.2326
[... per-step progress-bar updates for steps 14-109 of 500 elided; the running loss fluctuates between roughly 1.37 and 1.56, sitting near 1.50 by step 109 ...]
110/500 [=====>........................]
- ETA: 2:12 - loss: 1.5029 - regression_loss: 1.2922 - classification_loss: 0.2107 111/500 [=====>........................] - ETA: 2:12 - loss: 1.5018 - regression_loss: 1.2913 - classification_loss: 0.2105 112/500 [=====>........................] - ETA: 2:11 - loss: 1.4987 - regression_loss: 1.2887 - classification_loss: 0.2099 113/500 [=====>........................] - ETA: 2:11 - loss: 1.5004 - regression_loss: 1.2906 - classification_loss: 0.2098 114/500 [=====>........................] - ETA: 2:11 - loss: 1.5016 - regression_loss: 1.2916 - classification_loss: 0.2100 115/500 [=====>........................] - ETA: 2:10 - loss: 1.5002 - regression_loss: 1.2907 - classification_loss: 0.2095 116/500 [=====>........................] - ETA: 2:10 - loss: 1.4996 - regression_loss: 1.2904 - classification_loss: 0.2092 117/500 [======>.......................] - ETA: 2:09 - loss: 1.5026 - regression_loss: 1.2931 - classification_loss: 0.2095 118/500 [======>.......................] - ETA: 2:09 - loss: 1.5063 - regression_loss: 1.2965 - classification_loss: 0.2098 119/500 [======>.......................] - ETA: 2:09 - loss: 1.5052 - regression_loss: 1.2961 - classification_loss: 0.2091 120/500 [======>.......................] - ETA: 2:09 - loss: 1.5088 - regression_loss: 1.2987 - classification_loss: 0.2101 121/500 [======>.......................] - ETA: 2:08 - loss: 1.5113 - regression_loss: 1.3012 - classification_loss: 0.2101 122/500 [======>.......................] - ETA: 2:08 - loss: 1.5098 - regression_loss: 1.3000 - classification_loss: 0.2098 123/500 [======>.......................] - ETA: 2:07 - loss: 1.5124 - regression_loss: 1.3020 - classification_loss: 0.2104 124/500 [======>.......................] - ETA: 2:07 - loss: 1.5153 - regression_loss: 1.3047 - classification_loss: 0.2105 125/500 [======>.......................] - ETA: 2:07 - loss: 1.5297 - regression_loss: 1.3142 - classification_loss: 0.2155 126/500 [======>.......................] 
- ETA: 2:06 - loss: 1.5257 - regression_loss: 1.3112 - classification_loss: 0.2145 127/500 [======>.......................] - ETA: 2:06 - loss: 1.5247 - regression_loss: 1.3109 - classification_loss: 0.2138 128/500 [======>.......................] - ETA: 2:06 - loss: 1.5279 - regression_loss: 1.3136 - classification_loss: 0.2143 129/500 [======>.......................] - ETA: 2:05 - loss: 1.5308 - regression_loss: 1.3165 - classification_loss: 0.2142 130/500 [======>.......................] - ETA: 2:05 - loss: 1.5288 - regression_loss: 1.3151 - classification_loss: 0.2137 131/500 [======>.......................] - ETA: 2:05 - loss: 1.5313 - regression_loss: 1.3169 - classification_loss: 0.2144 132/500 [======>.......................] - ETA: 2:04 - loss: 1.5262 - regression_loss: 1.3126 - classification_loss: 0.2136 133/500 [======>.......................] - ETA: 2:04 - loss: 1.5239 - regression_loss: 1.3107 - classification_loss: 0.2132 134/500 [=======>......................] - ETA: 2:04 - loss: 1.5239 - regression_loss: 1.3108 - classification_loss: 0.2131 135/500 [=======>......................] - ETA: 2:03 - loss: 1.5179 - regression_loss: 1.3056 - classification_loss: 0.2124 136/500 [=======>......................] - ETA: 2:03 - loss: 1.5217 - regression_loss: 1.3088 - classification_loss: 0.2129 137/500 [=======>......................] - ETA: 2:03 - loss: 1.5186 - regression_loss: 1.3063 - classification_loss: 0.2122 138/500 [=======>......................] - ETA: 2:02 - loss: 1.5175 - regression_loss: 1.3055 - classification_loss: 0.2121 139/500 [=======>......................] - ETA: 2:02 - loss: 1.5143 - regression_loss: 1.3026 - classification_loss: 0.2116 140/500 [=======>......................] - ETA: 2:02 - loss: 1.5178 - regression_loss: 1.3054 - classification_loss: 0.2125 141/500 [=======>......................] - ETA: 2:01 - loss: 1.5185 - regression_loss: 1.3055 - classification_loss: 0.2130 142/500 [=======>......................] 
- ETA: 2:01 - loss: 1.5197 - regression_loss: 1.3064 - classification_loss: 0.2132 143/500 [=======>......................] - ETA: 2:01 - loss: 1.5217 - regression_loss: 1.3081 - classification_loss: 0.2136 144/500 [=======>......................] - ETA: 2:00 - loss: 1.5221 - regression_loss: 1.3085 - classification_loss: 0.2135 145/500 [=======>......................] - ETA: 2:00 - loss: 1.5295 - regression_loss: 1.3154 - classification_loss: 0.2141 146/500 [=======>......................] - ETA: 2:00 - loss: 1.5244 - regression_loss: 1.3108 - classification_loss: 0.2136 147/500 [=======>......................] - ETA: 1:59 - loss: 1.5243 - regression_loss: 1.3111 - classification_loss: 0.2131 148/500 [=======>......................] - ETA: 1:59 - loss: 1.5242 - regression_loss: 1.3096 - classification_loss: 0.2145 149/500 [=======>......................] - ETA: 1:59 - loss: 1.5227 - regression_loss: 1.3085 - classification_loss: 0.2142 150/500 [========>.....................] - ETA: 1:58 - loss: 1.5220 - regression_loss: 1.3076 - classification_loss: 0.2144 151/500 [========>.....................] - ETA: 1:58 - loss: 1.5176 - regression_loss: 1.3041 - classification_loss: 0.2135 152/500 [========>.....................] - ETA: 1:57 - loss: 1.5149 - regression_loss: 1.3021 - classification_loss: 0.2129 153/500 [========>.....................] - ETA: 1:57 - loss: 1.5164 - regression_loss: 1.3034 - classification_loss: 0.2130 154/500 [========>.....................] - ETA: 1:57 - loss: 1.5187 - regression_loss: 1.3052 - classification_loss: 0.2135 155/500 [========>.....................] - ETA: 1:56 - loss: 1.5183 - regression_loss: 1.3052 - classification_loss: 0.2132 156/500 [========>.....................] - ETA: 1:56 - loss: 1.5217 - regression_loss: 1.3079 - classification_loss: 0.2138 157/500 [========>.....................] - ETA: 1:56 - loss: 1.5237 - regression_loss: 1.3094 - classification_loss: 0.2143 158/500 [========>.....................] 
- ETA: 1:55 - loss: 1.5254 - regression_loss: 1.3109 - classification_loss: 0.2145 159/500 [========>.....................] - ETA: 1:55 - loss: 1.5200 - regression_loss: 1.3061 - classification_loss: 0.2139 160/500 [========>.....................] - ETA: 1:55 - loss: 1.5195 - regression_loss: 1.3060 - classification_loss: 0.2136 161/500 [========>.....................] - ETA: 1:54 - loss: 1.5223 - regression_loss: 1.3088 - classification_loss: 0.2135 162/500 [========>.....................] - ETA: 1:54 - loss: 1.5218 - regression_loss: 1.3081 - classification_loss: 0.2137 163/500 [========>.....................] - ETA: 1:54 - loss: 1.5211 - regression_loss: 1.3076 - classification_loss: 0.2135 164/500 [========>.....................] - ETA: 1:53 - loss: 1.5169 - regression_loss: 1.3040 - classification_loss: 0.2129 165/500 [========>.....................] - ETA: 1:53 - loss: 1.5126 - regression_loss: 1.3004 - classification_loss: 0.2122 166/500 [========>.....................] - ETA: 1:53 - loss: 1.5147 - regression_loss: 1.3023 - classification_loss: 0.2124 167/500 [=========>....................] - ETA: 1:52 - loss: 1.5197 - regression_loss: 1.3073 - classification_loss: 0.2124 168/500 [=========>....................] - ETA: 1:52 - loss: 1.5158 - regression_loss: 1.3041 - classification_loss: 0.2117 169/500 [=========>....................] - ETA: 1:52 - loss: 1.5160 - regression_loss: 1.3046 - classification_loss: 0.2114 170/500 [=========>....................] - ETA: 1:51 - loss: 1.5164 - regression_loss: 1.3052 - classification_loss: 0.2111 171/500 [=========>....................] - ETA: 1:51 - loss: 1.5160 - regression_loss: 1.3053 - classification_loss: 0.2107 172/500 [=========>....................] - ETA: 1:51 - loss: 1.5175 - regression_loss: 1.3065 - classification_loss: 0.2110 173/500 [=========>....................] - ETA: 1:50 - loss: 1.5172 - regression_loss: 1.3066 - classification_loss: 0.2106 174/500 [=========>....................] 
- ETA: 1:50 - loss: 1.5177 - regression_loss: 1.3071 - classification_loss: 0.2106 175/500 [=========>....................] - ETA: 1:50 - loss: 1.5201 - regression_loss: 1.3092 - classification_loss: 0.2110 176/500 [=========>....................] - ETA: 1:49 - loss: 1.5195 - regression_loss: 1.3090 - classification_loss: 0.2105 177/500 [=========>....................] - ETA: 1:49 - loss: 1.5197 - regression_loss: 1.3092 - classification_loss: 0.2105 178/500 [=========>....................] - ETA: 1:49 - loss: 1.5194 - regression_loss: 1.3092 - classification_loss: 0.2102 179/500 [=========>....................] - ETA: 1:48 - loss: 1.5196 - regression_loss: 1.3098 - classification_loss: 0.2098 180/500 [=========>....................] - ETA: 1:48 - loss: 1.5214 - regression_loss: 1.3116 - classification_loss: 0.2099 181/500 [=========>....................] - ETA: 1:47 - loss: 1.5207 - regression_loss: 1.3110 - classification_loss: 0.2097 182/500 [=========>....................] - ETA: 1:47 - loss: 1.5212 - regression_loss: 1.3115 - classification_loss: 0.2096 183/500 [=========>....................] - ETA: 1:47 - loss: 1.5218 - regression_loss: 1.3123 - classification_loss: 0.2095 184/500 [==========>...................] - ETA: 1:46 - loss: 1.5219 - regression_loss: 1.3128 - classification_loss: 0.2091 185/500 [==========>...................] - ETA: 1:46 - loss: 1.5244 - regression_loss: 1.3149 - classification_loss: 0.2095 186/500 [==========>...................] - ETA: 1:46 - loss: 1.5268 - regression_loss: 1.3171 - classification_loss: 0.2097 187/500 [==========>...................] - ETA: 1:45 - loss: 1.5253 - regression_loss: 1.3159 - classification_loss: 0.2094 188/500 [==========>...................] - ETA: 1:45 - loss: 1.5263 - regression_loss: 1.3167 - classification_loss: 0.2096 189/500 [==========>...................] - ETA: 1:45 - loss: 1.5266 - regression_loss: 1.3171 - classification_loss: 0.2095 190/500 [==========>...................] 
- ETA: 1:44 - loss: 1.5263 - regression_loss: 1.3167 - classification_loss: 0.2097 191/500 [==========>...................] - ETA: 1:44 - loss: 1.5271 - regression_loss: 1.3174 - classification_loss: 0.2096 192/500 [==========>...................] - ETA: 1:44 - loss: 1.5259 - regression_loss: 1.3163 - classification_loss: 0.2096 193/500 [==========>...................] - ETA: 1:43 - loss: 1.5243 - regression_loss: 1.3149 - classification_loss: 0.2094 194/500 [==========>...................] - ETA: 1:43 - loss: 1.5252 - regression_loss: 1.3158 - classification_loss: 0.2094 195/500 [==========>...................] - ETA: 1:43 - loss: 1.5218 - regression_loss: 1.3131 - classification_loss: 0.2087 196/500 [==========>...................] - ETA: 1:42 - loss: 1.5216 - regression_loss: 1.3131 - classification_loss: 0.2086 197/500 [==========>...................] - ETA: 1:42 - loss: 1.5197 - regression_loss: 1.3113 - classification_loss: 0.2083 198/500 [==========>...................] - ETA: 1:42 - loss: 1.5221 - regression_loss: 1.3136 - classification_loss: 0.2086 199/500 [==========>...................] - ETA: 1:41 - loss: 1.5271 - regression_loss: 1.3178 - classification_loss: 0.2093 200/500 [===========>..................] - ETA: 1:41 - loss: 1.5295 - regression_loss: 1.3197 - classification_loss: 0.2097 201/500 [===========>..................] - ETA: 1:41 - loss: 1.5339 - regression_loss: 1.3236 - classification_loss: 0.2103 202/500 [===========>..................] - ETA: 1:40 - loss: 1.5330 - regression_loss: 1.3229 - classification_loss: 0.2101 203/500 [===========>..................] - ETA: 1:40 - loss: 1.5316 - regression_loss: 1.3217 - classification_loss: 0.2099 204/500 [===========>..................] - ETA: 1:40 - loss: 1.5324 - regression_loss: 1.3224 - classification_loss: 0.2100 205/500 [===========>..................] - ETA: 1:39 - loss: 1.5322 - regression_loss: 1.3219 - classification_loss: 0.2103 206/500 [===========>..................] 
- ETA: 1:39 - loss: 1.5342 - regression_loss: 1.3236 - classification_loss: 0.2106 207/500 [===========>..................] - ETA: 1:39 - loss: 1.5341 - regression_loss: 1.3235 - classification_loss: 0.2106 208/500 [===========>..................] - ETA: 1:38 - loss: 1.5367 - regression_loss: 1.3258 - classification_loss: 0.2110 209/500 [===========>..................] - ETA: 1:38 - loss: 1.5380 - regression_loss: 1.3270 - classification_loss: 0.2110 210/500 [===========>..................] - ETA: 1:38 - loss: 1.5381 - regression_loss: 1.3268 - classification_loss: 0.2113 211/500 [===========>..................] - ETA: 1:37 - loss: 1.5395 - regression_loss: 1.3280 - classification_loss: 0.2115 212/500 [===========>..................] - ETA: 1:37 - loss: 1.5398 - regression_loss: 1.3282 - classification_loss: 0.2116 213/500 [===========>..................] - ETA: 1:37 - loss: 1.5426 - regression_loss: 1.3299 - classification_loss: 0.2127 214/500 [===========>..................] - ETA: 1:36 - loss: 1.5443 - regression_loss: 1.3314 - classification_loss: 0.2129 215/500 [===========>..................] - ETA: 1:36 - loss: 1.5464 - regression_loss: 1.3330 - classification_loss: 0.2133 216/500 [===========>..................] - ETA: 1:36 - loss: 1.5495 - regression_loss: 1.3356 - classification_loss: 0.2139 217/500 [============>.................] - ETA: 1:35 - loss: 1.5468 - regression_loss: 1.3333 - classification_loss: 0.2135 218/500 [============>.................] - ETA: 1:35 - loss: 1.5476 - regression_loss: 1.3339 - classification_loss: 0.2138 219/500 [============>.................] - ETA: 1:35 - loss: 1.5503 - regression_loss: 1.3358 - classification_loss: 0.2145 220/500 [============>.................] - ETA: 1:34 - loss: 1.5530 - regression_loss: 1.3380 - classification_loss: 0.2150 221/500 [============>.................] - ETA: 1:34 - loss: 1.5552 - regression_loss: 1.3399 - classification_loss: 0.2154 222/500 [============>.................] 
- ETA: 1:34 - loss: 1.5566 - regression_loss: 1.3408 - classification_loss: 0.2158 223/500 [============>.................] - ETA: 1:33 - loss: 1.5561 - regression_loss: 1.3405 - classification_loss: 0.2156 224/500 [============>.................] - ETA: 1:33 - loss: 1.5574 - regression_loss: 1.3417 - classification_loss: 0.2157 225/500 [============>.................] - ETA: 1:33 - loss: 1.5536 - regression_loss: 1.3383 - classification_loss: 0.2152 226/500 [============>.................] - ETA: 1:32 - loss: 1.5563 - regression_loss: 1.3406 - classification_loss: 0.2157 227/500 [============>.................] - ETA: 1:32 - loss: 1.5554 - regression_loss: 1.3391 - classification_loss: 0.2164 228/500 [============>.................] - ETA: 1:32 - loss: 1.5582 - regression_loss: 1.3414 - classification_loss: 0.2168 229/500 [============>.................] - ETA: 1:31 - loss: 1.5622 - regression_loss: 1.3450 - classification_loss: 0.2172 230/500 [============>.................] - ETA: 1:31 - loss: 1.5633 - regression_loss: 1.3460 - classification_loss: 0.2173 231/500 [============>.................] - ETA: 1:31 - loss: 1.5671 - regression_loss: 1.3490 - classification_loss: 0.2181 232/500 [============>.................] - ETA: 1:30 - loss: 1.5664 - regression_loss: 1.3484 - classification_loss: 0.2180 233/500 [============>.................] - ETA: 1:30 - loss: 1.5685 - regression_loss: 1.3498 - classification_loss: 0.2187 234/500 [=============>................] - ETA: 1:30 - loss: 1.5675 - regression_loss: 1.3491 - classification_loss: 0.2184 235/500 [=============>................] - ETA: 1:29 - loss: 1.5675 - regression_loss: 1.3492 - classification_loss: 0.2183 236/500 [=============>................] - ETA: 1:29 - loss: 1.5675 - regression_loss: 1.3492 - classification_loss: 0.2183 237/500 [=============>................] - ETA: 1:29 - loss: 1.5699 - regression_loss: 1.3517 - classification_loss: 0.2182 238/500 [=============>................] 
- ETA: 1:28 - loss: 1.5696 - regression_loss: 1.3516 - classification_loss: 0.2180 239/500 [=============>................] - ETA: 1:28 - loss: 1.5704 - regression_loss: 1.3523 - classification_loss: 0.2181 240/500 [=============>................] - ETA: 1:27 - loss: 1.5702 - regression_loss: 1.3523 - classification_loss: 0.2179 241/500 [=============>................] - ETA: 1:27 - loss: 1.5668 - regression_loss: 1.3493 - classification_loss: 0.2175 242/500 [=============>................] - ETA: 1:27 - loss: 1.5671 - regression_loss: 1.3497 - classification_loss: 0.2174 243/500 [=============>................] - ETA: 1:26 - loss: 1.5670 - regression_loss: 1.3496 - classification_loss: 0.2174 244/500 [=============>................] - ETA: 1:26 - loss: 1.5678 - regression_loss: 1.3503 - classification_loss: 0.2175 245/500 [=============>................] - ETA: 1:26 - loss: 1.5689 - regression_loss: 1.3513 - classification_loss: 0.2176 246/500 [=============>................] - ETA: 1:25 - loss: 1.5698 - regression_loss: 1.3521 - classification_loss: 0.2177 247/500 [=============>................] - ETA: 1:25 - loss: 1.5706 - regression_loss: 1.3528 - classification_loss: 0.2178 248/500 [=============>................] - ETA: 1:25 - loss: 1.5728 - regression_loss: 1.3546 - classification_loss: 0.2182 249/500 [=============>................] - ETA: 1:24 - loss: 1.5726 - regression_loss: 1.3542 - classification_loss: 0.2185 250/500 [==============>...............] - ETA: 1:24 - loss: 1.5715 - regression_loss: 1.3531 - classification_loss: 0.2184 251/500 [==============>...............] - ETA: 1:24 - loss: 1.5729 - regression_loss: 1.3541 - classification_loss: 0.2187 252/500 [==============>...............] - ETA: 1:23 - loss: 1.5727 - regression_loss: 1.3535 - classification_loss: 0.2192 253/500 [==============>...............] - ETA: 1:23 - loss: 1.5740 - regression_loss: 1.3546 - classification_loss: 0.2193 254/500 [==============>...............] 
- ETA: 1:23 - loss: 1.5736 - regression_loss: 1.3545 - classification_loss: 0.2191 255/500 [==============>...............] - ETA: 1:22 - loss: 1.5742 - regression_loss: 1.3550 - classification_loss: 0.2192 256/500 [==============>...............] - ETA: 1:22 - loss: 1.5795 - regression_loss: 1.3591 - classification_loss: 0.2204 257/500 [==============>...............] - ETA: 1:22 - loss: 1.5793 - regression_loss: 1.3590 - classification_loss: 0.2203 258/500 [==============>...............] - ETA: 1:21 - loss: 1.5807 - regression_loss: 1.3602 - classification_loss: 0.2204 259/500 [==============>...............] - ETA: 1:21 - loss: 1.5787 - regression_loss: 1.3586 - classification_loss: 0.2201 260/500 [==============>...............] - ETA: 1:21 - loss: 1.5793 - regression_loss: 1.3591 - classification_loss: 0.2202 261/500 [==============>...............] - ETA: 1:20 - loss: 1.5785 - regression_loss: 1.3586 - classification_loss: 0.2199 262/500 [==============>...............] - ETA: 1:20 - loss: 1.5838 - regression_loss: 1.3631 - classification_loss: 0.2207 263/500 [==============>...............] - ETA: 1:20 - loss: 1.5809 - regression_loss: 1.3606 - classification_loss: 0.2203 264/500 [==============>...............] - ETA: 1:19 - loss: 1.5801 - regression_loss: 1.3600 - classification_loss: 0.2201 265/500 [==============>...............] - ETA: 1:19 - loss: 1.5784 - regression_loss: 1.3586 - classification_loss: 0.2198 266/500 [==============>...............] - ETA: 1:19 - loss: 1.5792 - regression_loss: 1.3594 - classification_loss: 0.2199 267/500 [===============>..............] - ETA: 1:18 - loss: 1.5788 - regression_loss: 1.3590 - classification_loss: 0.2198 268/500 [===============>..............] - ETA: 1:18 - loss: 1.5784 - regression_loss: 1.3586 - classification_loss: 0.2198 269/500 [===============>..............] - ETA: 1:18 - loss: 1.5769 - regression_loss: 1.3571 - classification_loss: 0.2198 270/500 [===============>..............] 
- ETA: 1:17 - loss: 1.5759 - regression_loss: 1.3563 - classification_loss: 0.2196 271/500 [===============>..............] - ETA: 1:17 - loss: 1.5762 - regression_loss: 1.3566 - classification_loss: 0.2196 272/500 [===============>..............] - ETA: 1:17 - loss: 1.5745 - regression_loss: 1.3551 - classification_loss: 0.2194 273/500 [===============>..............] - ETA: 1:16 - loss: 1.5759 - regression_loss: 1.3566 - classification_loss: 0.2193 274/500 [===============>..............] - ETA: 1:16 - loss: 1.5760 - regression_loss: 1.3568 - classification_loss: 0.2193 275/500 [===============>..............] - ETA: 1:16 - loss: 1.5738 - regression_loss: 1.3551 - classification_loss: 0.2187 276/500 [===============>..............] - ETA: 1:15 - loss: 1.5722 - regression_loss: 1.3537 - classification_loss: 0.2185 277/500 [===============>..............] - ETA: 1:15 - loss: 1.5708 - regression_loss: 1.3526 - classification_loss: 0.2182 278/500 [===============>..............] - ETA: 1:15 - loss: 1.5700 - regression_loss: 1.3519 - classification_loss: 0.2181 279/500 [===============>..............] - ETA: 1:14 - loss: 1.5708 - regression_loss: 1.3526 - classification_loss: 0.2182 280/500 [===============>..............] - ETA: 1:14 - loss: 1.5717 - regression_loss: 1.3537 - classification_loss: 0.2181 281/500 [===============>..............] - ETA: 1:14 - loss: 1.5716 - regression_loss: 1.3537 - classification_loss: 0.2179 282/500 [===============>..............] - ETA: 1:13 - loss: 1.5695 - regression_loss: 1.3518 - classification_loss: 0.2177 283/500 [===============>..............] - ETA: 1:13 - loss: 1.5682 - regression_loss: 1.3509 - classification_loss: 0.2173 284/500 [================>.............] - ETA: 1:13 - loss: 1.5684 - regression_loss: 1.3511 - classification_loss: 0.2173 285/500 [================>.............] - ETA: 1:12 - loss: 1.5674 - regression_loss: 1.3505 - classification_loss: 0.2169 286/500 [================>.............] 
- ETA: 1:12 - loss: 1.5665 - regression_loss: 1.3499 - classification_loss: 0.2166 287/500 [================>.............] - ETA: 1:12 - loss: 1.5652 - regression_loss: 1.3488 - classification_loss: 0.2164 288/500 [================>.............] - ETA: 1:11 - loss: 1.5651 - regression_loss: 1.3488 - classification_loss: 0.2163 289/500 [================>.............] - ETA: 1:11 - loss: 1.5644 - regression_loss: 1.3483 - classification_loss: 0.2161 290/500 [================>.............] - ETA: 1:11 - loss: 1.5649 - regression_loss: 1.3486 - classification_loss: 0.2163 291/500 [================>.............] - ETA: 1:10 - loss: 1.5662 - regression_loss: 1.3497 - classification_loss: 0.2165 292/500 [================>.............] - ETA: 1:10 - loss: 1.5644 - regression_loss: 1.3482 - classification_loss: 0.2162 293/500 [================>.............] - ETA: 1:10 - loss: 1.5648 - regression_loss: 1.3486 - classification_loss: 0.2162 294/500 [================>.............] - ETA: 1:09 - loss: 1.5646 - regression_loss: 1.3484 - classification_loss: 0.2161 295/500 [================>.............] - ETA: 1:09 - loss: 1.5633 - regression_loss: 1.3474 - classification_loss: 0.2158 296/500 [================>.............] - ETA: 1:09 - loss: 1.5641 - regression_loss: 1.3482 - classification_loss: 0.2160 297/500 [================>.............] - ETA: 1:08 - loss: 1.5654 - regression_loss: 1.3491 - classification_loss: 0.2163 298/500 [================>.............] - ETA: 1:08 - loss: 1.5671 - regression_loss: 1.3508 - classification_loss: 0.2164 299/500 [================>.............] - ETA: 1:08 - loss: 1.5671 - regression_loss: 1.3507 - classification_loss: 0.2164 300/500 [=================>............] - ETA: 1:07 - loss: 1.5664 - regression_loss: 1.3501 - classification_loss: 0.2163 301/500 [=================>............] - ETA: 1:07 - loss: 1.5685 - regression_loss: 1.3520 - classification_loss: 0.2165 302/500 [=================>............] 
- ETA: 1:07 - loss: 1.5666 - regression_loss: 1.3505 - classification_loss: 0.2162 303/500 [=================>............] - ETA: 1:06 - loss: 1.5675 - regression_loss: 1.3511 - classification_loss: 0.2164 304/500 [=================>............] - ETA: 1:06 - loss: 1.5669 - regression_loss: 1.3508 - classification_loss: 0.2161 305/500 [=================>............] - ETA: 1:06 - loss: 1.5679 - regression_loss: 1.3519 - classification_loss: 0.2160 306/500 [=================>............] - ETA: 1:05 - loss: 1.5697 - regression_loss: 1.3534 - classification_loss: 0.2163 307/500 [=================>............] - ETA: 1:05 - loss: 1.5678 - regression_loss: 1.3520 - classification_loss: 0.2159 308/500 [=================>............] - ETA: 1:05 - loss: 1.5686 - regression_loss: 1.3526 - classification_loss: 0.2159 309/500 [=================>............] - ETA: 1:04 - loss: 1.5703 - regression_loss: 1.3538 - classification_loss: 0.2165 310/500 [=================>............] - ETA: 1:04 - loss: 1.5696 - regression_loss: 1.3533 - classification_loss: 0.2163 311/500 [=================>............] - ETA: 1:04 - loss: 1.5690 - regression_loss: 1.3529 - classification_loss: 0.2161 312/500 [=================>............] - ETA: 1:03 - loss: 1.5711 - regression_loss: 1.3546 - classification_loss: 0.2165 313/500 [=================>............] - ETA: 1:03 - loss: 1.5720 - regression_loss: 1.3553 - classification_loss: 0.2167 314/500 [=================>............] - ETA: 1:03 - loss: 1.5710 - regression_loss: 1.3539 - classification_loss: 0.2171 315/500 [=================>............] - ETA: 1:02 - loss: 1.5713 - regression_loss: 1.3540 - classification_loss: 0.2173 316/500 [=================>............] - ETA: 1:02 - loss: 1.5708 - regression_loss: 1.3538 - classification_loss: 0.2171 317/500 [==================>...........] - ETA: 1:02 - loss: 1.5732 - regression_loss: 1.3557 - classification_loss: 0.2175 318/500 [==================>...........] 
- ETA: 1:01 - loss: 1.5759 - regression_loss: 1.3577 - classification_loss: 0.2182 319/500 [==================>...........] - ETA: 1:01 - loss: 1.5753 - regression_loss: 1.3572 - classification_loss: 0.2181 320/500 [==================>...........] - ETA: 1:00 - loss: 1.5738 - regression_loss: 1.3558 - classification_loss: 0.2180 321/500 [==================>...........] - ETA: 1:00 - loss: 1.5745 - regression_loss: 1.3564 - classification_loss: 0.2181 322/500 [==================>...........] - ETA: 1:00 - loss: 1.5741 - regression_loss: 1.3561 - classification_loss: 0.2181 323/500 [==================>...........] - ETA: 59s - loss: 1.5748 - regression_loss: 1.3567 - classification_loss: 0.2181  324/500 [==================>...........] - ETA: 59s - loss: 1.5781 - regression_loss: 1.3591 - classification_loss: 0.2191 325/500 [==================>...........] - ETA: 59s - loss: 1.5776 - regression_loss: 1.3588 - classification_loss: 0.2188 326/500 [==================>...........] - ETA: 58s - loss: 1.5773 - regression_loss: 1.3585 - classification_loss: 0.2188 327/500 [==================>...........] - ETA: 58s - loss: 1.5784 - regression_loss: 1.3594 - classification_loss: 0.2189 328/500 [==================>...........] - ETA: 58s - loss: 1.5790 - regression_loss: 1.3603 - classification_loss: 0.2188 329/500 [==================>...........] - ETA: 57s - loss: 1.5789 - regression_loss: 1.3602 - classification_loss: 0.2187 330/500 [==================>...........] - ETA: 57s - loss: 1.5790 - regression_loss: 1.3602 - classification_loss: 0.2188 331/500 [==================>...........] - ETA: 57s - loss: 1.5773 - regression_loss: 1.3582 - classification_loss: 0.2191 332/500 [==================>...........] - ETA: 56s - loss: 1.5784 - regression_loss: 1.3591 - classification_loss: 0.2193 333/500 [==================>...........] - ETA: 56s - loss: 1.5796 - regression_loss: 1.3602 - classification_loss: 0.2194 334/500 [===================>..........] 
- ETA: 56s - loss: 1.5808 - regression_loss: 1.3610 - classification_loss: 0.2198 335/500 [===================>..........] - ETA: 55s - loss: 1.5815 - regression_loss: 1.3615 - classification_loss: 0.2200 336/500 [===================>..........] - ETA: 55s - loss: 1.5817 - regression_loss: 1.3618 - classification_loss: 0.2200 337/500 [===================>..........] - ETA: 55s - loss: 1.5808 - regression_loss: 1.3610 - classification_loss: 0.2198 338/500 [===================>..........] - ETA: 54s - loss: 1.5793 - regression_loss: 1.3597 - classification_loss: 0.2195 339/500 [===================>..........] - ETA: 54s - loss: 1.5805 - regression_loss: 1.3608 - classification_loss: 0.2197 340/500 [===================>..........] - ETA: 54s - loss: 1.5799 - regression_loss: 1.3602 - classification_loss: 0.2197 341/500 [===================>..........] - ETA: 53s - loss: 1.5802 - regression_loss: 1.3604 - classification_loss: 0.2198 342/500 [===================>..........] - ETA: 53s - loss: 1.5817 - regression_loss: 1.3616 - classification_loss: 0.2201 343/500 [===================>..........] - ETA: 53s - loss: 1.5825 - regression_loss: 1.3625 - classification_loss: 0.2201 344/500 [===================>..........] - ETA: 52s - loss: 1.5835 - regression_loss: 1.3633 - classification_loss: 0.2201 345/500 [===================>..........] - ETA: 52s - loss: 1.5842 - regression_loss: 1.3638 - classification_loss: 0.2203 346/500 [===================>..........] - ETA: 52s - loss: 1.5827 - regression_loss: 1.3627 - classification_loss: 0.2200 347/500 [===================>..........] - ETA: 51s - loss: 1.5839 - regression_loss: 1.3638 - classification_loss: 0.2201 348/500 [===================>..........] - ETA: 51s - loss: 1.5837 - regression_loss: 1.3635 - classification_loss: 0.2202 349/500 [===================>..........] - ETA: 51s - loss: 1.5827 - regression_loss: 1.3625 - classification_loss: 0.2202 350/500 [====================>.........] 
- ETA: 50s - loss: 1.5818 - regression_loss: 1.3616 - classification_loss: 0.2202 351/500 [====================>.........] - ETA: 50s - loss: 1.5818 - regression_loss: 1.3617 - classification_loss: 0.2201 352/500 [====================>.........] - ETA: 50s - loss: 1.5820 - regression_loss: 1.3619 - classification_loss: 0.2201 353/500 [====================>.........] - ETA: 49s - loss: 1.5866 - regression_loss: 1.3662 - classification_loss: 0.2203 354/500 [====================>.........] - ETA: 49s - loss: 1.5866 - regression_loss: 1.3663 - classification_loss: 0.2204 355/500 [====================>.........] - ETA: 49s - loss: 1.5867 - regression_loss: 1.3664 - classification_loss: 0.2202 356/500 [====================>.........] - ETA: 48s - loss: 1.5874 - regression_loss: 1.3671 - classification_loss: 0.2203 357/500 [====================>.........] - ETA: 48s - loss: 1.5865 - regression_loss: 1.3664 - classification_loss: 0.2201 358/500 [====================>.........] - ETA: 48s - loss: 1.5860 - regression_loss: 1.3661 - classification_loss: 0.2198 359/500 [====================>.........] - ETA: 47s - loss: 1.5859 - regression_loss: 1.3660 - classification_loss: 0.2199 360/500 [====================>.........] - ETA: 47s - loss: 1.5865 - regression_loss: 1.3667 - classification_loss: 0.2198 361/500 [====================>.........] - ETA: 47s - loss: 1.5867 - regression_loss: 1.3669 - classification_loss: 0.2198 362/500 [====================>.........] - ETA: 46s - loss: 1.5869 - regression_loss: 1.3671 - classification_loss: 0.2198 363/500 [====================>.........] - ETA: 46s - loss: 1.5865 - regression_loss: 1.3668 - classification_loss: 0.2197 364/500 [====================>.........] - ETA: 46s - loss: 1.5874 - regression_loss: 1.3676 - classification_loss: 0.2197 365/500 [====================>.........] - ETA: 45s - loss: 1.5887 - regression_loss: 1.3687 - classification_loss: 0.2200 366/500 [====================>.........] 
- ETA: 45s - loss: 1.5857 - regression_loss: 1.3660 - classification_loss: 0.2197 367/500 [=====================>........] - ETA: 45s - loss: 1.5869 - regression_loss: 1.3670 - classification_loss: 0.2199 368/500 [=====================>........] - ETA: 44s - loss: 1.5873 - regression_loss: 1.3673 - classification_loss: 0.2199 369/500 [=====================>........] - ETA: 44s - loss: 1.5886 - regression_loss: 1.3684 - classification_loss: 0.2202 370/500 [=====================>........] - ETA: 44s - loss: 1.5912 - regression_loss: 1.3706 - classification_loss: 0.2206 371/500 [=====================>........] - ETA: 43s - loss: 1.5920 - regression_loss: 1.3714 - classification_loss: 0.2207 372/500 [=====================>........] - ETA: 43s - loss: 1.5929 - regression_loss: 1.3720 - classification_loss: 0.2209 373/500 [=====================>........] - ETA: 43s - loss: 1.5933 - regression_loss: 1.3721 - classification_loss: 0.2212 374/500 [=====================>........] - ETA: 42s - loss: 1.5925 - regression_loss: 1.3714 - classification_loss: 0.2210 375/500 [=====================>........] - ETA: 42s - loss: 1.5935 - regression_loss: 1.3724 - classification_loss: 0.2211 376/500 [=====================>........] - ETA: 42s - loss: 1.5928 - regression_loss: 1.3719 - classification_loss: 0.2210 377/500 [=====================>........] - ETA: 41s - loss: 1.5924 - regression_loss: 1.3715 - classification_loss: 0.2209 378/500 [=====================>........] - ETA: 41s - loss: 1.5907 - regression_loss: 1.3701 - classification_loss: 0.2206 379/500 [=====================>........] - ETA: 41s - loss: 1.5907 - regression_loss: 1.3700 - classification_loss: 0.2207 380/500 [=====================>........] - ETA: 40s - loss: 1.5918 - regression_loss: 1.3710 - classification_loss: 0.2208 381/500 [=====================>........] - ETA: 40s - loss: 1.5919 - regression_loss: 1.3713 - classification_loss: 0.2206 382/500 [=====================>........] 
- ETA: 39s - loss: 1.5917 - regression_loss: 1.3712 - classification_loss: 0.2206 383/500 [=====================>........] - ETA: 39s - loss: 1.5933 - regression_loss: 1.3724 - classification_loss: 0.2209 384/500 [======================>.......] - ETA: 39s - loss: 1.5935 - regression_loss: 1.3726 - classification_loss: 0.2209 385/500 [======================>.......] - ETA: 38s - loss: 1.5918 - regression_loss: 1.3712 - classification_loss: 0.2206 386/500 [======================>.......] - ETA: 38s - loss: 1.5908 - regression_loss: 1.3703 - classification_loss: 0.2204 387/500 [======================>.......] - ETA: 38s - loss: 1.5906 - regression_loss: 1.3703 - classification_loss: 0.2203 388/500 [======================>.......] - ETA: 37s - loss: 1.5910 - regression_loss: 1.3706 - classification_loss: 0.2204 389/500 [======================>.......] - ETA: 37s - loss: 1.5906 - regression_loss: 1.3703 - classification_loss: 0.2203 390/500 [======================>.......] - ETA: 37s - loss: 1.5911 - regression_loss: 1.3708 - classification_loss: 0.2203 391/500 [======================>.......] - ETA: 36s - loss: 1.5895 - regression_loss: 1.3695 - classification_loss: 0.2200 392/500 [======================>.......] - ETA: 36s - loss: 1.5894 - regression_loss: 1.3694 - classification_loss: 0.2200 393/500 [======================>.......] - ETA: 36s - loss: 1.5902 - regression_loss: 1.3701 - classification_loss: 0.2201 394/500 [======================>.......] - ETA: 35s - loss: 1.5909 - regression_loss: 1.3708 - classification_loss: 0.2201 395/500 [======================>.......] - ETA: 35s - loss: 1.5911 - regression_loss: 1.3709 - classification_loss: 0.2202 396/500 [======================>.......] - ETA: 35s - loss: 1.5910 - regression_loss: 1.3708 - classification_loss: 0.2201 397/500 [======================>.......] - ETA: 34s - loss: 1.5909 - regression_loss: 1.3708 - classification_loss: 0.2201 398/500 [======================>.......] 
- ETA: 34s - loss: 1.5904 - regression_loss: 1.3705 - classification_loss: 0.2199 399/500 [======================>.......] - ETA: 34s - loss: 1.5897 - regression_loss: 1.3702 - classification_loss: 0.2195 400/500 [=======================>......] - ETA: 33s - loss: 1.5903 - regression_loss: 1.3705 - classification_loss: 0.2198 401/500 [=======================>......] - ETA: 33s - loss: 1.5909 - regression_loss: 1.3710 - classification_loss: 0.2199 402/500 [=======================>......] - ETA: 33s - loss: 1.5893 - regression_loss: 1.3697 - classification_loss: 0.2196 403/500 [=======================>......] - ETA: 32s - loss: 1.5888 - regression_loss: 1.3694 - classification_loss: 0.2194 404/500 [=======================>......] - ETA: 32s - loss: 1.5890 - regression_loss: 1.3696 - classification_loss: 0.2194 405/500 [=======================>......] - ETA: 32s - loss: 1.5885 - regression_loss: 1.3692 - classification_loss: 0.2193 406/500 [=======================>......] - ETA: 31s - loss: 1.5891 - regression_loss: 1.3698 - classification_loss: 0.2193 407/500 [=======================>......] - ETA: 31s - loss: 1.5903 - regression_loss: 1.3708 - classification_loss: 0.2195 408/500 [=======================>......] - ETA: 31s - loss: 1.5897 - regression_loss: 1.3704 - classification_loss: 0.2193 409/500 [=======================>......] - ETA: 30s - loss: 1.5898 - regression_loss: 1.3706 - classification_loss: 0.2192 410/500 [=======================>......] - ETA: 30s - loss: 1.5899 - regression_loss: 1.3706 - classification_loss: 0.2193 411/500 [=======================>......] - ETA: 30s - loss: 1.5897 - regression_loss: 1.3705 - classification_loss: 0.2192 412/500 [=======================>......] - ETA: 29s - loss: 1.5909 - regression_loss: 1.3716 - classification_loss: 0.2194 413/500 [=======================>......] - ETA: 29s - loss: 1.5912 - regression_loss: 1.3719 - classification_loss: 0.2193 414/500 [=======================>......] 
- ETA: 29s - loss: 1.5903 - regression_loss: 1.3711 - classification_loss: 0.2192 415/500 [=======================>......] - ETA: 28s - loss: 1.5914 - regression_loss: 1.3721 - classification_loss: 0.2194 416/500 [=======================>......] - ETA: 28s - loss: 1.5936 - regression_loss: 1.3738 - classification_loss: 0.2198 417/500 [========================>.....] - ETA: 28s - loss: 1.5943 - regression_loss: 1.3743 - classification_loss: 0.2200 418/500 [========================>.....] - ETA: 27s - loss: 1.5949 - regression_loss: 1.3747 - classification_loss: 0.2201 419/500 [========================>.....] - ETA: 27s - loss: 1.5959 - regression_loss: 1.3756 - classification_loss: 0.2202 420/500 [========================>.....] - ETA: 27s - loss: 1.5939 - regression_loss: 1.3739 - classification_loss: 0.2200 421/500 [========================>.....] - ETA: 26s - loss: 1.5945 - regression_loss: 1.3742 - classification_loss: 0.2202 422/500 [========================>.....] - ETA: 26s - loss: 1.5947 - regression_loss: 1.3743 - classification_loss: 0.2203 423/500 [========================>.....] - ETA: 26s - loss: 1.5934 - regression_loss: 1.3733 - classification_loss: 0.2201 424/500 [========================>.....] - ETA: 25s - loss: 1.5934 - regression_loss: 1.3733 - classification_loss: 0.2201 425/500 [========================>.....] - ETA: 25s - loss: 1.5948 - regression_loss: 1.3745 - classification_loss: 0.2203 426/500 [========================>.....] - ETA: 25s - loss: 1.5966 - regression_loss: 1.3754 - classification_loss: 0.2212 427/500 [========================>.....] - ETA: 24s - loss: 1.5973 - regression_loss: 1.3759 - classification_loss: 0.2214 428/500 [========================>.....] - ETA: 24s - loss: 1.5974 - regression_loss: 1.3760 - classification_loss: 0.2214 429/500 [========================>.....] - ETA: 24s - loss: 1.5986 - regression_loss: 1.3768 - classification_loss: 0.2218 430/500 [========================>.....] 
- ETA: 23s - loss: 1.5988 - regression_loss: 1.3770 - classification_loss: 0.2218 431/500 [========================>.....] - ETA: 23s - loss: 1.6013 - regression_loss: 1.3788 - classification_loss: 0.2225 432/500 [========================>.....] - ETA: 23s - loss: 1.6017 - regression_loss: 1.3792 - classification_loss: 0.2226 433/500 [========================>.....] - ETA: 22s - loss: 1.6000 - regression_loss: 1.3778 - classification_loss: 0.2222 434/500 [=========================>....] - ETA: 22s - loss: 1.6005 - regression_loss: 1.3782 - classification_loss: 0.2223 435/500 [=========================>....] - ETA: 22s - loss: 1.6008 - regression_loss: 1.3786 - classification_loss: 0.2223 436/500 [=========================>....] - ETA: 21s - loss: 1.6020 - regression_loss: 1.3795 - classification_loss: 0.2225 437/500 [=========================>....] - ETA: 21s - loss: 1.6026 - regression_loss: 1.3801 - classification_loss: 0.2225 438/500 [=========================>....] - ETA: 21s - loss: 1.6025 - regression_loss: 1.3800 - classification_loss: 0.2225 439/500 [=========================>....] - ETA: 20s - loss: 1.6015 - regression_loss: 1.3792 - classification_loss: 0.2222 440/500 [=========================>....] - ETA: 20s - loss: 1.6019 - regression_loss: 1.3794 - classification_loss: 0.2225 441/500 [=========================>....] - ETA: 20s - loss: 1.6019 - regression_loss: 1.3793 - classification_loss: 0.2226 442/500 [=========================>....] - ETA: 19s - loss: 1.6015 - regression_loss: 1.3787 - classification_loss: 0.2228 443/500 [=========================>....] - ETA: 19s - loss: 1.6002 - regression_loss: 1.3776 - classification_loss: 0.2225 444/500 [=========================>....] - ETA: 18s - loss: 1.6006 - regression_loss: 1.3779 - classification_loss: 0.2227 445/500 [=========================>....] - ETA: 18s - loss: 1.6008 - regression_loss: 1.3781 - classification_loss: 0.2227 446/500 [=========================>....] 
- ETA: 18s - loss: 1.5996 - regression_loss: 1.3772 - classification_loss: 0.2225 447/500 [=========================>....] - ETA: 17s - loss: 1.5978 - regression_loss: 1.3756 - classification_loss: 0.2222 448/500 [=========================>....] - ETA: 17s - loss: 1.5990 - regression_loss: 1.3767 - classification_loss: 0.2223 449/500 [=========================>....] - ETA: 17s - loss: 1.6001 - regression_loss: 1.3776 - classification_loss: 0.2225 450/500 [==========================>...] - ETA: 16s - loss: 1.6003 - regression_loss: 1.3778 - classification_loss: 0.2226 451/500 [==========================>...] - ETA: 16s - loss: 1.5992 - regression_loss: 1.3768 - classification_loss: 0.2224 452/500 [==========================>...] - ETA: 16s - loss: 1.5989 - regression_loss: 1.3767 - classification_loss: 0.2223 453/500 [==========================>...] - ETA: 15s - loss: 1.5993 - regression_loss: 1.3770 - classification_loss: 0.2223 454/500 [==========================>...] - ETA: 15s - loss: 1.5993 - regression_loss: 1.3770 - classification_loss: 0.2223 455/500 [==========================>...] - ETA: 15s - loss: 1.5987 - regression_loss: 1.3764 - classification_loss: 0.2223 456/500 [==========================>...] - ETA: 14s - loss: 1.5988 - regression_loss: 1.3767 - classification_loss: 0.2221 457/500 [==========================>...] - ETA: 14s - loss: 1.5991 - regression_loss: 1.3770 - classification_loss: 0.2221 458/500 [==========================>...] - ETA: 14s - loss: 1.5971 - regression_loss: 1.3752 - classification_loss: 0.2219 459/500 [==========================>...] - ETA: 13s - loss: 1.5970 - regression_loss: 1.3752 - classification_loss: 0.2218 460/500 [==========================>...] - ETA: 13s - loss: 1.5988 - regression_loss: 1.3769 - classification_loss: 0.2219 461/500 [==========================>...] - ETA: 13s - loss: 1.5982 - regression_loss: 1.3765 - classification_loss: 0.2218 462/500 [==========================>...] 
- ETA: 12s - loss: 1.5978 - regression_loss: 1.3758 - classification_loss: 0.2221 463/500 [==========================>...] - ETA: 12s - loss: 1.5976 - regression_loss: 1.3755 - classification_loss: 0.2221 464/500 [==========================>...] - ETA: 12s - loss: 1.5976 - regression_loss: 1.3753 - classification_loss: 0.2223 465/500 [==========================>...] - ETA: 11s - loss: 1.5965 - regression_loss: 1.3743 - classification_loss: 0.2221 466/500 [==========================>...] - ETA: 11s - loss: 1.5966 - regression_loss: 1.3744 - classification_loss: 0.2222 467/500 [===========================>..] - ETA: 11s - loss: 1.5962 - regression_loss: 1.3742 - classification_loss: 0.2220 468/500 [===========================>..] - ETA: 10s - loss: 1.5952 - regression_loss: 1.3732 - classification_loss: 0.2220 469/500 [===========================>..] - ETA: 10s - loss: 1.5945 - regression_loss: 1.3727 - classification_loss: 0.2218 470/500 [===========================>..] - ETA: 10s - loss: 1.5950 - regression_loss: 1.3731 - classification_loss: 0.2219 471/500 [===========================>..] - ETA: 9s - loss: 1.5950 - regression_loss: 1.3733 - classification_loss: 0.2218  472/500 [===========================>..] - ETA: 9s - loss: 1.5946 - regression_loss: 1.3730 - classification_loss: 0.2216 473/500 [===========================>..] - ETA: 9s - loss: 1.5961 - regression_loss: 1.3744 - classification_loss: 0.2217 474/500 [===========================>..] - ETA: 8s - loss: 1.5950 - regression_loss: 1.3735 - classification_loss: 0.2216 475/500 [===========================>..] - ETA: 8s - loss: 1.5950 - regression_loss: 1.3735 - classification_loss: 0.2215 476/500 [===========================>..] - ETA: 8s - loss: 1.5956 - regression_loss: 1.3739 - classification_loss: 0.2217 477/500 [===========================>..] - ETA: 7s - loss: 1.5952 - regression_loss: 1.3735 - classification_loss: 0.2216 478/500 [===========================>..] 
- ETA: 7s - loss: 1.5948 - regression_loss: 1.3733 - classification_loss: 0.2215 479/500 [===========================>..] - ETA: 7s - loss: 1.5946 - regression_loss: 1.3731 - classification_loss: 0.2214 480/500 [===========================>..] - ETA: 6s - loss: 1.5947 - regression_loss: 1.3732 - classification_loss: 0.2214 481/500 [===========================>..] - ETA: 6s - loss: 1.5933 - regression_loss: 1.3720 - classification_loss: 0.2213 482/500 [===========================>..] - ETA: 6s - loss: 1.5946 - regression_loss: 1.3731 - classification_loss: 0.2215 483/500 [===========================>..] - ETA: 5s - loss: 1.5944 - regression_loss: 1.3729 - classification_loss: 0.2214 484/500 [============================>.] - ETA: 5s - loss: 1.5932 - regression_loss: 1.3720 - classification_loss: 0.2212 485/500 [============================>.] - ETA: 5s - loss: 1.5932 - regression_loss: 1.3721 - classification_loss: 0.2211 486/500 [============================>.] - ETA: 4s - loss: 1.5924 - regression_loss: 1.3714 - classification_loss: 0.2210 487/500 [============================>.] - ETA: 4s - loss: 1.5929 - regression_loss: 1.3717 - classification_loss: 0.2212 488/500 [============================>.] - ETA: 4s - loss: 1.5937 - regression_loss: 1.3723 - classification_loss: 0.2214 489/500 [============================>.] - ETA: 3s - loss: 1.5935 - regression_loss: 1.3721 - classification_loss: 0.2214 490/500 [============================>.] - ETA: 3s - loss: 1.5934 - regression_loss: 1.3719 - classification_loss: 0.2215 491/500 [============================>.] - ETA: 3s - loss: 1.5916 - regression_loss: 1.3703 - classification_loss: 0.2213 492/500 [============================>.] - ETA: 2s - loss: 1.5920 - regression_loss: 1.3705 - classification_loss: 0.2214 493/500 [============================>.] - ETA: 2s - loss: 1.5919 - regression_loss: 1.3705 - classification_loss: 0.2214 494/500 [============================>.] 
- ETA: 2s - loss: 1.5905 - regression_loss: 1.3694 - classification_loss: 0.2211 495/500 [============================>.] - ETA: 1s - loss: 1.5901 - regression_loss: 1.3689 - classification_loss: 0.2213 496/500 [============================>.] - ETA: 1s - loss: 1.5909 - regression_loss: 1.3694 - classification_loss: 0.2215 497/500 [============================>.] - ETA: 1s - loss: 1.5917 - regression_loss: 1.3699 - classification_loss: 0.2218 498/500 [============================>.] - ETA: 0s - loss: 1.5917 - regression_loss: 1.3698 - classification_loss: 0.2219 499/500 [============================>.] - ETA: 0s - loss: 1.5903 - regression_loss: 1.3687 - classification_loss: 0.2217 500/500 [==============================] - 169s 339ms/step - loss: 1.5902 - regression_loss: 1.3684 - classification_loss: 0.2218 1172 instances of class plum with average precision: 0.7232 mAP: 0.7232 Epoch 00010: saving model to ./training/snapshots/resnet101_pascal_10.h5 Epoch 11/150 1/500 [..............................] - ETA: 2:35 - loss: 0.9358 - regression_loss: 0.7772 - classification_loss: 0.1586 2/500 [..............................] - ETA: 2:42 - loss: 1.4732 - regression_loss: 1.2481 - classification_loss: 0.2252 3/500 [..............................] - ETA: 2:43 - loss: 1.6698 - regression_loss: 1.4223 - classification_loss: 0.2475 4/500 [..............................] - ETA: 2:42 - loss: 1.7345 - regression_loss: 1.4921 - classification_loss: 0.2425 5/500 [..............................] - ETA: 2:41 - loss: 1.6286 - regression_loss: 1.4050 - classification_loss: 0.2236 6/500 [..............................] - ETA: 2:44 - loss: 1.6157 - regression_loss: 1.3970 - classification_loss: 0.2186 7/500 [..............................] - ETA: 2:45 - loss: 1.5669 - regression_loss: 1.3582 - classification_loss: 0.2087 8/500 [..............................] - ETA: 2:46 - loss: 1.6001 - regression_loss: 1.3863 - classification_loss: 0.2138 9/500 [..............................] 
- ETA: 2:45 - loss: 1.6236 - regression_loss: 1.4038 - classification_loss: 0.2198 10/500 [..............................] - ETA: 2:45 - loss: 1.6218 - regression_loss: 1.4066 - classification_loss: 0.2152 11/500 [..............................] - ETA: 2:45 - loss: 1.5478 - regression_loss: 1.3369 - classification_loss: 0.2109 12/500 [..............................] - ETA: 2:45 - loss: 1.6077 - regression_loss: 1.3857 - classification_loss: 0.2220 13/500 [..............................] - ETA: 2:45 - loss: 1.6082 - regression_loss: 1.3867 - classification_loss: 0.2215 14/500 [..............................] - ETA: 2:45 - loss: 1.5963 - regression_loss: 1.3768 - classification_loss: 0.2195 15/500 [..............................] - ETA: 2:45 - loss: 1.5816 - regression_loss: 1.3573 - classification_loss: 0.2244 16/500 [..............................] - ETA: 2:44 - loss: 1.6210 - regression_loss: 1.3889 - classification_loss: 0.2321 17/500 [>.............................] - ETA: 2:44 - loss: 1.6319 - regression_loss: 1.4002 - classification_loss: 0.2316 18/500 [>.............................] - ETA: 2:44 - loss: 1.6109 - regression_loss: 1.3832 - classification_loss: 0.2277 19/500 [>.............................] - ETA: 2:43 - loss: 1.6123 - regression_loss: 1.3869 - classification_loss: 0.2254 20/500 [>.............................] - ETA: 2:43 - loss: 1.6280 - regression_loss: 1.3997 - classification_loss: 0.2283 21/500 [>.............................] - ETA: 2:42 - loss: 1.6186 - regression_loss: 1.3910 - classification_loss: 0.2276 22/500 [>.............................] - ETA: 2:41 - loss: 1.6360 - regression_loss: 1.4076 - classification_loss: 0.2284 23/500 [>.............................] - ETA: 2:41 - loss: 1.6396 - regression_loss: 1.4096 - classification_loss: 0.2300 24/500 [>.............................] - ETA: 2:41 - loss: 1.6740 - regression_loss: 1.4342 - classification_loss: 0.2398 25/500 [>.............................] 
- ETA: 2:41 - loss: 1.6476 - regression_loss: 1.4129 - classification_loss: 0.2347 26/500 [>.............................] - ETA: 2:40 - loss: 1.6566 - regression_loss: 1.4188 - classification_loss: 0.2378 27/500 [>.............................] - ETA: 2:40 - loss: 1.6700 - regression_loss: 1.4283 - classification_loss: 0.2417 28/500 [>.............................] - ETA: 2:40 - loss: 1.6815 - regression_loss: 1.4390 - classification_loss: 0.2425 29/500 [>.............................] - ETA: 2:40 - loss: 1.6622 - regression_loss: 1.4245 - classification_loss: 0.2377 30/500 [>.............................] - ETA: 2:40 - loss: 1.6653 - regression_loss: 1.4274 - classification_loss: 0.2379 31/500 [>.............................] - ETA: 2:39 - loss: 1.6601 - regression_loss: 1.4233 - classification_loss: 0.2368 32/500 [>.............................] - ETA: 2:39 - loss: 1.6571 - regression_loss: 1.4201 - classification_loss: 0.2370 33/500 [>.............................] - ETA: 2:39 - loss: 1.6491 - regression_loss: 1.4140 - classification_loss: 0.2351 34/500 [=>............................] - ETA: 2:39 - loss: 1.6433 - regression_loss: 1.4105 - classification_loss: 0.2328 35/500 [=>............................] - ETA: 2:38 - loss: 1.6457 - regression_loss: 1.4130 - classification_loss: 0.2328 36/500 [=>............................] - ETA: 2:38 - loss: 1.6516 - regression_loss: 1.4180 - classification_loss: 0.2336 37/500 [=>............................] - ETA: 2:38 - loss: 1.6463 - regression_loss: 1.4145 - classification_loss: 0.2317 38/500 [=>............................] - ETA: 2:38 - loss: 1.6212 - regression_loss: 1.3939 - classification_loss: 0.2272 39/500 [=>............................] - ETA: 2:37 - loss: 1.6274 - regression_loss: 1.3988 - classification_loss: 0.2286 40/500 [=>............................] - ETA: 2:37 - loss: 1.6139 - regression_loss: 1.3885 - classification_loss: 0.2254 41/500 [=>............................] 
- ETA: 2:36 - loss: 1.6037 - regression_loss: 1.3801 - classification_loss: 0.2236 42/500 [=>............................] - ETA: 2:36 - loss: 1.6088 - regression_loss: 1.3845 - classification_loss: 0.2243 43/500 [=>............................] - ETA: 2:36 - loss: 1.6090 - regression_loss: 1.3857 - classification_loss: 0.2233 44/500 [=>............................] - ETA: 2:35 - loss: 1.6141 - regression_loss: 1.3903 - classification_loss: 0.2239 45/500 [=>............................] - ETA: 2:35 - loss: 1.6198 - regression_loss: 1.3954 - classification_loss: 0.2244 46/500 [=>............................] - ETA: 2:35 - loss: 1.6183 - regression_loss: 1.3944 - classification_loss: 0.2239 47/500 [=>............................] - ETA: 2:34 - loss: 1.6159 - regression_loss: 1.3925 - classification_loss: 0.2233 48/500 [=>............................] - ETA: 2:34 - loss: 1.6168 - regression_loss: 1.3951 - classification_loss: 0.2217 49/500 [=>............................] - ETA: 2:33 - loss: 1.6099 - regression_loss: 1.3885 - classification_loss: 0.2214 50/500 [==>...........................] - ETA: 2:33 - loss: 1.6068 - regression_loss: 1.3864 - classification_loss: 0.2204 51/500 [==>...........................] - ETA: 2:32 - loss: 1.6105 - regression_loss: 1.3890 - classification_loss: 0.2216 52/500 [==>...........................] - ETA: 2:32 - loss: 1.6153 - regression_loss: 1.3927 - classification_loss: 0.2226 53/500 [==>...........................] - ETA: 2:32 - loss: 1.6078 - regression_loss: 1.3864 - classification_loss: 0.2213 54/500 [==>...........................] - ETA: 2:31 - loss: 1.6049 - regression_loss: 1.3837 - classification_loss: 0.2212 55/500 [==>...........................] - ETA: 2:31 - loss: 1.6032 - regression_loss: 1.3823 - classification_loss: 0.2209 56/500 [==>...........................] - ETA: 2:31 - loss: 1.5884 - regression_loss: 1.3701 - classification_loss: 0.2182 57/500 [==>...........................] 
- ETA: 2:30 - loss: 1.5736 - regression_loss: 1.3584 - classification_loss: 0.2152 58/500 [==>...........................] - ETA: 2:30 - loss: 1.5728 - regression_loss: 1.3584 - classification_loss: 0.2145 59/500 [==>...........................] - ETA: 2:30 - loss: 1.5740 - regression_loss: 1.3597 - classification_loss: 0.2144 60/500 [==>...........................] - ETA: 2:30 - loss: 1.5716 - regression_loss: 1.3585 - classification_loss: 0.2131 61/500 [==>...........................] - ETA: 2:29 - loss: 1.5823 - regression_loss: 1.3687 - classification_loss: 0.2136 62/500 [==>...........................] - ETA: 2:29 - loss: 1.5844 - regression_loss: 1.3689 - classification_loss: 0.2155 63/500 [==>...........................] - ETA: 2:28 - loss: 1.5813 - regression_loss: 1.3668 - classification_loss: 0.2145 64/500 [==>...........................] - ETA: 2:28 - loss: 1.5719 - regression_loss: 1.3590 - classification_loss: 0.2129 65/500 [==>...........................] - ETA: 2:28 - loss: 1.5731 - regression_loss: 1.3597 - classification_loss: 0.2134 66/500 [==>...........................] - ETA: 2:27 - loss: 1.5654 - regression_loss: 1.3534 - classification_loss: 0.2120 67/500 [===>..........................] - ETA: 2:27 - loss: 1.5626 - regression_loss: 1.3516 - classification_loss: 0.2110 68/500 [===>..........................] - ETA: 2:26 - loss: 1.5753 - regression_loss: 1.3611 - classification_loss: 0.2142 69/500 [===>..........................] - ETA: 2:26 - loss: 1.5743 - regression_loss: 1.3598 - classification_loss: 0.2145 70/500 [===>..........................] - ETA: 2:26 - loss: 1.5696 - regression_loss: 1.3564 - classification_loss: 0.2132 71/500 [===>..........................] - ETA: 2:25 - loss: 1.5760 - regression_loss: 1.3605 - classification_loss: 0.2155 72/500 [===>..........................] - ETA: 2:25 - loss: 1.5721 - regression_loss: 1.3578 - classification_loss: 0.2143 73/500 [===>..........................] 
- ETA: 2:24 - loss: 1.5726 - regression_loss: 1.3587 - classification_loss: 0.2138 74/500 [===>..........................] - ETA: 2:24 - loss: 1.5732 - regression_loss: 1.3589 - classification_loss: 0.2143 75/500 [===>..........................] - ETA: 2:24 - loss: 1.5757 - regression_loss: 1.3609 - classification_loss: 0.2148 76/500 [===>..........................] - ETA: 2:23 - loss: 1.5820 - regression_loss: 1.3668 - classification_loss: 0.2151 77/500 [===>..........................] - ETA: 2:23 - loss: 1.5731 - regression_loss: 1.3592 - classification_loss: 0.2139 78/500 [===>..........................] - ETA: 2:23 - loss: 1.5716 - regression_loss: 1.3572 - classification_loss: 0.2144 79/500 [===>..........................] - ETA: 2:22 - loss: 1.5605 - regression_loss: 1.3481 - classification_loss: 0.2124 80/500 [===>..........................] - ETA: 2:22 - loss: 1.5682 - regression_loss: 1.3546 - classification_loss: 0.2136 81/500 [===>..........................] - ETA: 2:21 - loss: 1.5603 - regression_loss: 1.3478 - classification_loss: 0.2124 82/500 [===>..........................] - ETA: 2:21 - loss: 1.5609 - regression_loss: 1.3496 - classification_loss: 0.2112 83/500 [===>..........................] - ETA: 2:21 - loss: 1.5647 - regression_loss: 1.3537 - classification_loss: 0.2110 84/500 [====>.........................] - ETA: 2:20 - loss: 1.5639 - regression_loss: 1.3531 - classification_loss: 0.2108 85/500 [====>.........................] - ETA: 2:20 - loss: 1.5668 - regression_loss: 1.3554 - classification_loss: 0.2114 86/500 [====>.........................] - ETA: 2:20 - loss: 1.5580 - regression_loss: 1.3481 - classification_loss: 0.2099 87/500 [====>.........................] - ETA: 2:19 - loss: 1.5613 - regression_loss: 1.3504 - classification_loss: 0.2109 88/500 [====>.........................] - ETA: 2:19 - loss: 1.5580 - regression_loss: 1.3477 - classification_loss: 0.2103 89/500 [====>.........................] 
- ETA: 2:19 - loss: 1.5547 - regression_loss: 1.3443 - classification_loss: 0.2103
100/500 [=====>........................] - ETA: 2:15 - loss: 1.5590 - regression_loss: 1.3495 - classification_loss: 0.2095
125/500 [======>.......................] - ETA: 2:06 - loss: 1.5396 - regression_loss: 1.3368 - classification_loss: 0.2029
150/500 [========>.....................] - ETA: 1:58 - loss: 1.5399 - regression_loss: 1.3367 - classification_loss: 0.2031
175/500 [=========>....................] - ETA: 1:50 - loss: 1.5483 - regression_loss: 1.3455 - classification_loss: 0.2029
200/500 [===========>..................] - ETA: 1:42 - loss: 1.5548 - regression_loss: 1.3509 - classification_loss: 0.2039
225/500 [============>.................] - ETA: 1:33 - loss: 1.5411 - regression_loss: 1.3376 - classification_loss: 0.2034
250/500 [==============>...............] - ETA: 1:25 - loss: 1.5555 - regression_loss: 1.3481 - classification_loss: 0.2073
275/500 [===============>..............] - ETA: 1:16 - loss: 1.5558 - regression_loss: 1.3455 - classification_loss: 0.2103
300/500 [=================>............] - ETA: 1:08 - loss: 1.5529 - regression_loss: 1.3428 - classification_loss: 0.2101
325/500 [==================>...........] - ETA: 59s - loss: 1.5605 - regression_loss: 1.3496 - classification_loss: 0.2109
350/500 [====================>.........] - ETA: 51s - loss: 1.5544 - regression_loss: 1.3450 - classification_loss: 0.2094
375/500 [=====================>........] - ETA: 42s - loss: 1.5551 - regression_loss: 1.3464 - classification_loss: 0.2087
400/500 [=======================>......] - ETA: 34s - loss: 1.5557 - regression_loss: 1.3472 - classification_loss: 0.2085
424/500 [========================>.....] - ETA: 25s - loss: 1.5451 - regression_loss: 1.3384 - classification_loss: 0.2067
425/500 [========================>.....] 
- ETA: 25s - loss: 1.5464 - regression_loss: 1.3395 - classification_loss: 0.2069 426/500 [========================>.....] - ETA: 25s - loss: 1.5460 - regression_loss: 1.3392 - classification_loss: 0.2068 427/500 [========================>.....] - ETA: 24s - loss: 1.5481 - regression_loss: 1.3408 - classification_loss: 0.2073 428/500 [========================>.....] - ETA: 24s - loss: 1.5468 - regression_loss: 1.3396 - classification_loss: 0.2071 429/500 [========================>.....] - ETA: 24s - loss: 1.5459 - regression_loss: 1.3389 - classification_loss: 0.2070 430/500 [========================>.....] - ETA: 23s - loss: 1.5451 - regression_loss: 1.3382 - classification_loss: 0.2069 431/500 [========================>.....] - ETA: 23s - loss: 1.5455 - regression_loss: 1.3386 - classification_loss: 0.2070 432/500 [========================>.....] - ETA: 23s - loss: 1.5454 - regression_loss: 1.3384 - classification_loss: 0.2069 433/500 [========================>.....] - ETA: 22s - loss: 1.5455 - regression_loss: 1.3386 - classification_loss: 0.2069 434/500 [=========================>....] - ETA: 22s - loss: 1.5460 - regression_loss: 1.3392 - classification_loss: 0.2069 435/500 [=========================>....] - ETA: 22s - loss: 1.5469 - regression_loss: 1.3400 - classification_loss: 0.2069 436/500 [=========================>....] - ETA: 21s - loss: 1.5463 - regression_loss: 1.3395 - classification_loss: 0.2067 437/500 [=========================>....] - ETA: 21s - loss: 1.5460 - regression_loss: 1.3394 - classification_loss: 0.2066 438/500 [=========================>....] - ETA: 21s - loss: 1.5453 - regression_loss: 1.3388 - classification_loss: 0.2065 439/500 [=========================>....] - ETA: 20s - loss: 1.5440 - regression_loss: 1.3378 - classification_loss: 0.2063 440/500 [=========================>....] - ETA: 20s - loss: 1.5450 - regression_loss: 1.3386 - classification_loss: 0.2064 441/500 [=========================>....] 
- ETA: 20s - loss: 1.5436 - regression_loss: 1.3374 - classification_loss: 0.2062 442/500 [=========================>....] - ETA: 19s - loss: 1.5446 - regression_loss: 1.3382 - classification_loss: 0.2064 443/500 [=========================>....] - ETA: 19s - loss: 1.5447 - regression_loss: 1.3383 - classification_loss: 0.2064 444/500 [=========================>....] - ETA: 19s - loss: 1.5437 - regression_loss: 1.3374 - classification_loss: 0.2063 445/500 [=========================>....] - ETA: 18s - loss: 1.5438 - regression_loss: 1.3375 - classification_loss: 0.2063 446/500 [=========================>....] - ETA: 18s - loss: 1.5427 - regression_loss: 1.3365 - classification_loss: 0.2062 447/500 [=========================>....] - ETA: 18s - loss: 1.5423 - regression_loss: 1.3363 - classification_loss: 0.2061 448/500 [=========================>....] - ETA: 17s - loss: 1.5411 - regression_loss: 1.3351 - classification_loss: 0.2060 449/500 [=========================>....] - ETA: 17s - loss: 1.5414 - regression_loss: 1.3354 - classification_loss: 0.2060 450/500 [==========================>...] - ETA: 17s - loss: 1.5396 - regression_loss: 1.3339 - classification_loss: 0.2057 451/500 [==========================>...] - ETA: 16s - loss: 1.5401 - regression_loss: 1.3343 - classification_loss: 0.2058 452/500 [==========================>...] - ETA: 16s - loss: 1.5403 - regression_loss: 1.3345 - classification_loss: 0.2058 453/500 [==========================>...] - ETA: 15s - loss: 1.5396 - regression_loss: 1.3339 - classification_loss: 0.2057 454/500 [==========================>...] - ETA: 15s - loss: 1.5394 - regression_loss: 1.3337 - classification_loss: 0.2057 455/500 [==========================>...] - ETA: 15s - loss: 1.5395 - regression_loss: 1.3338 - classification_loss: 0.2056 456/500 [==========================>...] - ETA: 14s - loss: 1.5403 - regression_loss: 1.3344 - classification_loss: 0.2058 457/500 [==========================>...] 
- ETA: 14s - loss: 1.5405 - regression_loss: 1.3349 - classification_loss: 0.2056 458/500 [==========================>...] - ETA: 14s - loss: 1.5399 - regression_loss: 1.3344 - classification_loss: 0.2056 459/500 [==========================>...] - ETA: 13s - loss: 1.5403 - regression_loss: 1.3346 - classification_loss: 0.2057 460/500 [==========================>...] - ETA: 13s - loss: 1.5387 - regression_loss: 1.3332 - classification_loss: 0.2055 461/500 [==========================>...] - ETA: 13s - loss: 1.5376 - regression_loss: 1.3324 - classification_loss: 0.2052 462/500 [==========================>...] - ETA: 12s - loss: 1.5373 - regression_loss: 1.3320 - classification_loss: 0.2052 463/500 [==========================>...] - ETA: 12s - loss: 1.5375 - regression_loss: 1.3323 - classification_loss: 0.2052 464/500 [==========================>...] - ETA: 12s - loss: 1.5366 - regression_loss: 1.3315 - classification_loss: 0.2051 465/500 [==========================>...] - ETA: 11s - loss: 1.5343 - regression_loss: 1.3296 - classification_loss: 0.2048 466/500 [==========================>...] - ETA: 11s - loss: 1.5348 - regression_loss: 1.3299 - classification_loss: 0.2049 467/500 [===========================>..] - ETA: 11s - loss: 1.5354 - regression_loss: 1.3304 - classification_loss: 0.2049 468/500 [===========================>..] - ETA: 10s - loss: 1.5358 - regression_loss: 1.3308 - classification_loss: 0.2050 469/500 [===========================>..] - ETA: 10s - loss: 1.5342 - regression_loss: 1.3295 - classification_loss: 0.2047 470/500 [===========================>..] - ETA: 10s - loss: 1.5358 - regression_loss: 1.3304 - classification_loss: 0.2055 471/500 [===========================>..] - ETA: 9s - loss: 1.5351 - regression_loss: 1.3297 - classification_loss: 0.2054  472/500 [===========================>..] - ETA: 9s - loss: 1.5349 - regression_loss: 1.3296 - classification_loss: 0.2053 473/500 [===========================>..] 
- ETA: 9s - loss: 1.5346 - regression_loss: 1.3292 - classification_loss: 0.2053 474/500 [===========================>..] - ETA: 8s - loss: 1.5352 - regression_loss: 1.3299 - classification_loss: 0.2053 475/500 [===========================>..] - ETA: 8s - loss: 1.5351 - regression_loss: 1.3298 - classification_loss: 0.2053 476/500 [===========================>..] - ETA: 8s - loss: 1.5349 - regression_loss: 1.3297 - classification_loss: 0.2052 477/500 [===========================>..] - ETA: 7s - loss: 1.5352 - regression_loss: 1.3300 - classification_loss: 0.2053 478/500 [===========================>..] - ETA: 7s - loss: 1.5341 - regression_loss: 1.3290 - classification_loss: 0.2051 479/500 [===========================>..] - ETA: 7s - loss: 1.5321 - regression_loss: 1.3273 - classification_loss: 0.2049 480/500 [===========================>..] - ETA: 6s - loss: 1.5315 - regression_loss: 1.3267 - classification_loss: 0.2048 481/500 [===========================>..] - ETA: 6s - loss: 1.5315 - regression_loss: 1.3267 - classification_loss: 0.2048 482/500 [===========================>..] - ETA: 6s - loss: 1.5314 - regression_loss: 1.3267 - classification_loss: 0.2047 483/500 [===========================>..] - ETA: 5s - loss: 1.5319 - regression_loss: 1.3273 - classification_loss: 0.2047 484/500 [============================>.] - ETA: 5s - loss: 1.5306 - regression_loss: 1.3260 - classification_loss: 0.2046 485/500 [============================>.] - ETA: 5s - loss: 1.5310 - regression_loss: 1.3262 - classification_loss: 0.2048 486/500 [============================>.] - ETA: 4s - loss: 1.5303 - regression_loss: 1.3257 - classification_loss: 0.2046 487/500 [============================>.] - ETA: 4s - loss: 1.5291 - regression_loss: 1.3247 - classification_loss: 0.2044 488/500 [============================>.] - ETA: 4s - loss: 1.5298 - regression_loss: 1.3253 - classification_loss: 0.2045 489/500 [============================>.] 
- ETA: 3s - loss: 1.5298 - regression_loss: 1.3253 - classification_loss: 0.2045 490/500 [============================>.] - ETA: 3s - loss: 1.5303 - regression_loss: 1.3256 - classification_loss: 0.2046 491/500 [============================>.] - ETA: 3s - loss: 1.5310 - regression_loss: 1.3263 - classification_loss: 0.2047 492/500 [============================>.] - ETA: 2s - loss: 1.5319 - regression_loss: 1.3271 - classification_loss: 0.2047 493/500 [============================>.] - ETA: 2s - loss: 1.5313 - regression_loss: 1.3268 - classification_loss: 0.2045 494/500 [============================>.] - ETA: 2s - loss: 1.5300 - regression_loss: 1.3257 - classification_loss: 0.2043 495/500 [============================>.] - ETA: 1s - loss: 1.5285 - regression_loss: 1.3245 - classification_loss: 0.2040 496/500 [============================>.] - ETA: 1s - loss: 1.5303 - regression_loss: 1.3261 - classification_loss: 0.2042 497/500 [============================>.] - ETA: 1s - loss: 1.5290 - regression_loss: 1.3250 - classification_loss: 0.2039 498/500 [============================>.] - ETA: 0s - loss: 1.5302 - regression_loss: 1.3260 - classification_loss: 0.2041 499/500 [============================>.] - ETA: 0s - loss: 1.5304 - regression_loss: 1.3263 - classification_loss: 0.2041 500/500 [==============================] - 170s 340ms/step - loss: 1.5301 - regression_loss: 1.3260 - classification_loss: 0.2041 1172 instances of class plum with average precision: 0.6648 mAP: 0.6648 Epoch 00011: saving model to ./training/snapshots/resnet101_pascal_11.h5 Epoch 12/150 1/500 [..............................] - ETA: 2:34 - loss: 1.0474 - regression_loss: 0.8848 - classification_loss: 0.1626 2/500 [..............................] - ETA: 2:42 - loss: 1.4265 - regression_loss: 1.2289 - classification_loss: 0.1976 3/500 [..............................] - ETA: 2:44 - loss: 1.4562 - regression_loss: 1.2667 - classification_loss: 0.1895 4/500 [..............................] 
- ETA: 2:46 - loss: 1.6085 - regression_loss: 1.3930 - classification_loss: 0.2155 5/500 [..............................] - ETA: 2:47 - loss: 1.6907 - regression_loss: 1.4659 - classification_loss: 0.2248 6/500 [..............................] - ETA: 2:49 - loss: 1.6818 - regression_loss: 1.4564 - classification_loss: 0.2255 7/500 [..............................] - ETA: 2:48 - loss: 1.6342 - regression_loss: 1.4150 - classification_loss: 0.2193 8/500 [..............................] - ETA: 2:48 - loss: 1.6451 - regression_loss: 1.4213 - classification_loss: 0.2239 9/500 [..............................] - ETA: 2:48 - loss: 1.6761 - regression_loss: 1.4472 - classification_loss: 0.2290 10/500 [..............................] - ETA: 2:48 - loss: 1.7035 - regression_loss: 1.4648 - classification_loss: 0.2387 11/500 [..............................] - ETA: 2:47 - loss: 1.6809 - regression_loss: 1.4403 - classification_loss: 0.2406 12/500 [..............................] - ETA: 2:47 - loss: 1.6650 - regression_loss: 1.4279 - classification_loss: 0.2371 13/500 [..............................] - ETA: 2:47 - loss: 1.6321 - regression_loss: 1.3992 - classification_loss: 0.2329 14/500 [..............................] - ETA: 2:46 - loss: 1.6382 - regression_loss: 1.4062 - classification_loss: 0.2320 15/500 [..............................] - ETA: 2:46 - loss: 1.6116 - regression_loss: 1.3832 - classification_loss: 0.2284 16/500 [..............................] - ETA: 2:45 - loss: 1.5803 - regression_loss: 1.3567 - classification_loss: 0.2236 17/500 [>.............................] - ETA: 2:45 - loss: 1.6214 - regression_loss: 1.3959 - classification_loss: 0.2255 18/500 [>.............................] - ETA: 2:44 - loss: 1.6082 - regression_loss: 1.3901 - classification_loss: 0.2182 19/500 [>.............................] - ETA: 2:44 - loss: 1.6200 - regression_loss: 1.3978 - classification_loss: 0.2222 20/500 [>.............................] 
- ETA: 2:43 - loss: 1.5778 - regression_loss: 1.3638 - classification_loss: 0.2141 21/500 [>.............................] - ETA: 2:43 - loss: 1.5936 - regression_loss: 1.3774 - classification_loss: 0.2161 22/500 [>.............................] - ETA: 2:43 - loss: 1.5441 - regression_loss: 1.3339 - classification_loss: 0.2101 23/500 [>.............................] - ETA: 2:42 - loss: 1.5296 - regression_loss: 1.3219 - classification_loss: 0.2077 24/500 [>.............................] - ETA: 2:42 - loss: 1.5436 - regression_loss: 1.3356 - classification_loss: 0.2080 25/500 [>.............................] - ETA: 2:42 - loss: 1.5364 - regression_loss: 1.3306 - classification_loss: 0.2058 26/500 [>.............................] - ETA: 2:41 - loss: 1.5224 - regression_loss: 1.3199 - classification_loss: 0.2025 27/500 [>.............................] - ETA: 2:41 - loss: 1.5322 - regression_loss: 1.3262 - classification_loss: 0.2060 28/500 [>.............................] - ETA: 2:41 - loss: 1.5304 - regression_loss: 1.3253 - classification_loss: 0.2051 29/500 [>.............................] - ETA: 2:40 - loss: 1.5222 - regression_loss: 1.3177 - classification_loss: 0.2045 30/500 [>.............................] - ETA: 2:40 - loss: 1.5241 - regression_loss: 1.3190 - classification_loss: 0.2051 31/500 [>.............................] - ETA: 2:39 - loss: 1.5159 - regression_loss: 1.3083 - classification_loss: 0.2077 32/500 [>.............................] - ETA: 2:39 - loss: 1.5173 - regression_loss: 1.3086 - classification_loss: 0.2087 33/500 [>.............................] - ETA: 2:39 - loss: 1.5000 - regression_loss: 1.2947 - classification_loss: 0.2053 34/500 [=>............................] - ETA: 2:38 - loss: 1.5055 - regression_loss: 1.2989 - classification_loss: 0.2066 35/500 [=>............................] - ETA: 2:38 - loss: 1.5101 - regression_loss: 1.3035 - classification_loss: 0.2066 36/500 [=>............................] 
- ETA: 2:38 - loss: 1.5221 - regression_loss: 1.3140 - classification_loss: 0.2081 37/500 [=>............................] - ETA: 2:37 - loss: 1.5235 - regression_loss: 1.3150 - classification_loss: 0.2085 38/500 [=>............................] - ETA: 2:37 - loss: 1.5207 - regression_loss: 1.3137 - classification_loss: 0.2071 39/500 [=>............................] - ETA: 2:36 - loss: 1.5348 - regression_loss: 1.3260 - classification_loss: 0.2088 40/500 [=>............................] - ETA: 2:36 - loss: 1.5317 - regression_loss: 1.3221 - classification_loss: 0.2095 41/500 [=>............................] - ETA: 2:35 - loss: 1.5379 - regression_loss: 1.3278 - classification_loss: 0.2101 42/500 [=>............................] - ETA: 2:35 - loss: 1.5450 - regression_loss: 1.3340 - classification_loss: 0.2110 43/500 [=>............................] - ETA: 2:35 - loss: 1.5548 - regression_loss: 1.3426 - classification_loss: 0.2122 44/500 [=>............................] - ETA: 2:34 - loss: 1.5362 - regression_loss: 1.3268 - classification_loss: 0.2093 45/500 [=>............................] - ETA: 2:34 - loss: 1.5190 - regression_loss: 1.3126 - classification_loss: 0.2064 46/500 [=>............................] - ETA: 2:33 - loss: 1.5192 - regression_loss: 1.3137 - classification_loss: 0.2055 47/500 [=>............................] - ETA: 2:33 - loss: 1.5172 - regression_loss: 1.3132 - classification_loss: 0.2040 48/500 [=>............................] - ETA: 2:33 - loss: 1.5023 - regression_loss: 1.3003 - classification_loss: 0.2020 49/500 [=>............................] - ETA: 2:32 - loss: 1.5059 - regression_loss: 1.3038 - classification_loss: 0.2021 50/500 [==>...........................] - ETA: 2:32 - loss: 1.5039 - regression_loss: 1.3025 - classification_loss: 0.2014 51/500 [==>...........................] - ETA: 2:32 - loss: 1.4924 - regression_loss: 1.2910 - classification_loss: 0.2014 52/500 [==>...........................] 
- ETA: 2:31 - loss: 1.4903 - regression_loss: 1.2892 - classification_loss: 0.2011 53/500 [==>...........................] - ETA: 2:31 - loss: 1.4793 - regression_loss: 1.2795 - classification_loss: 0.1997 54/500 [==>...........................] - ETA: 2:31 - loss: 1.4839 - regression_loss: 1.2828 - classification_loss: 0.2011 55/500 [==>...........................] - ETA: 2:30 - loss: 1.4830 - regression_loss: 1.2823 - classification_loss: 0.2007 56/500 [==>...........................] - ETA: 2:30 - loss: 1.4711 - regression_loss: 1.2725 - classification_loss: 0.1986 57/500 [==>...........................] - ETA: 2:29 - loss: 1.4660 - regression_loss: 1.2676 - classification_loss: 0.1983 58/500 [==>...........................] - ETA: 2:29 - loss: 1.4606 - regression_loss: 1.2635 - classification_loss: 0.1972 59/500 [==>...........................] - ETA: 2:29 - loss: 1.4647 - regression_loss: 1.2671 - classification_loss: 0.1976 60/500 [==>...........................] - ETA: 2:29 - loss: 1.4508 - regression_loss: 1.2551 - classification_loss: 0.1957 61/500 [==>...........................] - ETA: 2:28 - loss: 1.4586 - regression_loss: 1.2613 - classification_loss: 0.1974 62/500 [==>...........................] - ETA: 2:28 - loss: 1.4439 - regression_loss: 1.2487 - classification_loss: 0.1952 63/500 [==>...........................] - ETA: 2:28 - loss: 1.4481 - regression_loss: 1.2522 - classification_loss: 0.1959 64/500 [==>...........................] - ETA: 2:27 - loss: 1.4446 - regression_loss: 1.2496 - classification_loss: 0.1950 65/500 [==>...........................] - ETA: 2:27 - loss: 1.4490 - regression_loss: 1.2534 - classification_loss: 0.1956 66/500 [==>...........................] - ETA: 2:26 - loss: 1.4499 - regression_loss: 1.2544 - classification_loss: 0.1955 67/500 [===>..........................] - ETA: 2:26 - loss: 1.4420 - regression_loss: 1.2483 - classification_loss: 0.1938 68/500 [===>..........................] 
- ETA: 2:26 - loss: 1.4397 - regression_loss: 1.2465 - classification_loss: 0.1932 69/500 [===>..........................] - ETA: 2:25 - loss: 1.4420 - regression_loss: 1.2490 - classification_loss: 0.1929 70/500 [===>..........................] - ETA: 2:25 - loss: 1.4394 - regression_loss: 1.2454 - classification_loss: 0.1940 71/500 [===>..........................] - ETA: 2:25 - loss: 1.4263 - regression_loss: 1.2338 - classification_loss: 0.1925 72/500 [===>..........................] - ETA: 2:24 - loss: 1.4215 - regression_loss: 1.2293 - classification_loss: 0.1922 73/500 [===>..........................] - ETA: 2:24 - loss: 1.4353 - regression_loss: 1.2398 - classification_loss: 0.1955 74/500 [===>..........................] - ETA: 2:24 - loss: 1.4395 - regression_loss: 1.2438 - classification_loss: 0.1957 75/500 [===>..........................] - ETA: 2:23 - loss: 1.4431 - regression_loss: 1.2472 - classification_loss: 0.1959 76/500 [===>..........................] - ETA: 2:23 - loss: 1.4494 - regression_loss: 1.2533 - classification_loss: 0.1962 77/500 [===>..........................] - ETA: 2:23 - loss: 1.4526 - regression_loss: 1.2556 - classification_loss: 0.1969 78/500 [===>..........................] - ETA: 2:22 - loss: 1.4449 - regression_loss: 1.2497 - classification_loss: 0.1952 79/500 [===>..........................] - ETA: 2:22 - loss: 1.4467 - regression_loss: 1.2513 - classification_loss: 0.1953 80/500 [===>..........................] - ETA: 2:22 - loss: 1.4482 - regression_loss: 1.2518 - classification_loss: 0.1964 81/500 [===>..........................] - ETA: 2:21 - loss: 1.4505 - regression_loss: 1.2540 - classification_loss: 0.1966 82/500 [===>..........................] - ETA: 2:21 - loss: 1.4515 - regression_loss: 1.2546 - classification_loss: 0.1970 83/500 [===>..........................] - ETA: 2:21 - loss: 1.4551 - regression_loss: 1.2566 - classification_loss: 0.1984 84/500 [====>.........................] 
- ETA: 2:20 - loss: 1.4533 - regression_loss: 1.2555 - classification_loss: 0.1978 85/500 [====>.........................] - ETA: 2:20 - loss: 1.4539 - regression_loss: 1.2558 - classification_loss: 0.1982 86/500 [====>.........................] - ETA: 2:19 - loss: 1.4611 - regression_loss: 1.2636 - classification_loss: 0.1975 87/500 [====>.........................] - ETA: 2:19 - loss: 1.4582 - regression_loss: 1.2608 - classification_loss: 0.1974 88/500 [====>.........................] - ETA: 2:19 - loss: 1.4641 - regression_loss: 1.2661 - classification_loss: 0.1980 89/500 [====>.........................] - ETA: 2:18 - loss: 1.4745 - regression_loss: 1.2743 - classification_loss: 0.2002 90/500 [====>.........................] - ETA: 2:18 - loss: 1.4783 - regression_loss: 1.2774 - classification_loss: 0.2009 91/500 [====>.........................] - ETA: 2:18 - loss: 1.4811 - regression_loss: 1.2802 - classification_loss: 0.2009 92/500 [====>.........................] - ETA: 2:17 - loss: 1.4824 - regression_loss: 1.2814 - classification_loss: 0.2010 93/500 [====>.........................] - ETA: 2:17 - loss: 1.4849 - regression_loss: 1.2841 - classification_loss: 0.2007 94/500 [====>.........................] - ETA: 2:17 - loss: 1.4830 - regression_loss: 1.2829 - classification_loss: 0.2001 95/500 [====>.........................] - ETA: 2:16 - loss: 1.4791 - regression_loss: 1.2796 - classification_loss: 0.1995 96/500 [====>.........................] - ETA: 2:16 - loss: 1.4832 - regression_loss: 1.2830 - classification_loss: 0.2002 97/500 [====>.........................] - ETA: 2:16 - loss: 1.4918 - regression_loss: 1.2902 - classification_loss: 0.2016 98/500 [====>.........................] - ETA: 2:15 - loss: 1.4948 - regression_loss: 1.2928 - classification_loss: 0.2020 99/500 [====>.........................] - ETA: 2:15 - loss: 1.4975 - regression_loss: 1.2955 - classification_loss: 0.2020 100/500 [=====>........................] 
- ETA: 2:15 - loss: 1.4984 - regression_loss: 1.2969 - classification_loss: 0.2015 101/500 [=====>........................] - ETA: 2:14 - loss: 1.5018 - regression_loss: 1.2999 - classification_loss: 0.2019 102/500 [=====>........................] - ETA: 2:14 - loss: 1.5040 - regression_loss: 1.3020 - classification_loss: 0.2020 103/500 [=====>........................] - ETA: 2:13 - loss: 1.5071 - regression_loss: 1.3047 - classification_loss: 0.2024 104/500 [=====>........................] - ETA: 2:13 - loss: 1.5039 - regression_loss: 1.3020 - classification_loss: 0.2018 105/500 [=====>........................] - ETA: 2:13 - loss: 1.5019 - regression_loss: 1.3008 - classification_loss: 0.2011 106/500 [=====>........................] - ETA: 2:13 - loss: 1.5027 - regression_loss: 1.3017 - classification_loss: 0.2010 107/500 [=====>........................] - ETA: 2:12 - loss: 1.5010 - regression_loss: 1.3002 - classification_loss: 0.2008 108/500 [=====>........................] - ETA: 2:12 - loss: 1.4962 - regression_loss: 1.2956 - classification_loss: 0.2006 109/500 [=====>........................] - ETA: 2:12 - loss: 1.4944 - regression_loss: 1.2939 - classification_loss: 0.2005 110/500 [=====>........................] - ETA: 2:11 - loss: 1.4964 - regression_loss: 1.2958 - classification_loss: 0.2006 111/500 [=====>........................] - ETA: 2:11 - loss: 1.4952 - regression_loss: 1.2949 - classification_loss: 0.2003 112/500 [=====>........................] - ETA: 2:11 - loss: 1.5024 - regression_loss: 1.3007 - classification_loss: 0.2016 113/500 [=====>........................] - ETA: 2:10 - loss: 1.5012 - regression_loss: 1.2992 - classification_loss: 0.2021 114/500 [=====>........................] - ETA: 2:10 - loss: 1.5014 - regression_loss: 1.2994 - classification_loss: 0.2020 115/500 [=====>........................] - ETA: 2:09 - loss: 1.5063 - regression_loss: 1.3040 - classification_loss: 0.2023 116/500 [=====>........................] 
- ETA: 2:09 - loss: 1.5053 - regression_loss: 1.3034 - classification_loss: 0.2019 117/500 [======>.......................] - ETA: 2:09 - loss: 1.5131 - regression_loss: 1.3103 - classification_loss: 0.2028 118/500 [======>.......................] - ETA: 2:09 - loss: 1.5178 - regression_loss: 1.3146 - classification_loss: 0.2031 119/500 [======>.......................] - ETA: 2:08 - loss: 1.5216 - regression_loss: 1.3178 - classification_loss: 0.2037 120/500 [======>.......................] - ETA: 2:08 - loss: 1.5201 - regression_loss: 1.3166 - classification_loss: 0.2036 121/500 [======>.......................] - ETA: 2:07 - loss: 1.5208 - regression_loss: 1.3171 - classification_loss: 0.2038 122/500 [======>.......................] - ETA: 2:07 - loss: 1.5246 - regression_loss: 1.3199 - classification_loss: 0.2046 123/500 [======>.......................] - ETA: 2:07 - loss: 1.5237 - regression_loss: 1.3186 - classification_loss: 0.2051 124/500 [======>.......................] - ETA: 2:06 - loss: 1.5307 - regression_loss: 1.3251 - classification_loss: 0.2056 125/500 [======>.......................] - ETA: 2:06 - loss: 1.5302 - regression_loss: 1.3251 - classification_loss: 0.2052 126/500 [======>.......................] - ETA: 2:06 - loss: 1.5295 - regression_loss: 1.3246 - classification_loss: 0.2050 127/500 [======>.......................] - ETA: 2:05 - loss: 1.5311 - regression_loss: 1.3255 - classification_loss: 0.2056 128/500 [======>.......................] - ETA: 2:05 - loss: 1.5300 - regression_loss: 1.3241 - classification_loss: 0.2059 129/500 [======>.......................] - ETA: 2:05 - loss: 1.5312 - regression_loss: 1.3252 - classification_loss: 0.2061 130/500 [======>.......................] - ETA: 2:04 - loss: 1.5337 - regression_loss: 1.3275 - classification_loss: 0.2062 131/500 [======>.......................] - ETA: 2:04 - loss: 1.5356 - regression_loss: 1.3292 - classification_loss: 0.2063 132/500 [======>.......................] 
- ETA: 2:04 - loss: 1.5289 - regression_loss: 1.3234 - classification_loss: 0.2055
133/500 [======>.......................] - ETA: 2:03 - loss: 1.5337 - regression_loss: 1.3278 - classification_loss: 0.2059
[per-batch progress rewrites for steps 134-467 elided; milestone records:]
150/500 [========>.....................] - ETA: 1:58 - loss: 1.5394 - regression_loss: 1.3321 - classification_loss: 0.2073
200/500 [===========>..................] - ETA: 1:41 - loss: 1.5116 - regression_loss: 1.3084 - classification_loss: 0.2032
250/500 [==============>...............] - ETA: 1:24 - loss: 1.5200 - regression_loss: 1.3170 - classification_loss: 0.2030
300/500 [=================>............] - ETA: 1:07 - loss: 1.5065 - regression_loss: 1.3057 - classification_loss: 0.2008
350/500 [====================>.........] - ETA: 50s - loss: 1.4932 - regression_loss: 1.2946 - classification_loss: 0.1985
400/500 [=======================>......] - ETA: 33s - loss: 1.4952 - regression_loss: 1.2946 - classification_loss: 0.2006
450/500 [==========================>...] - ETA: 16s - loss: 1.4967 - regression_loss: 1.2956 - classification_loss: 0.2010
468/500 [===========================>..] 
- ETA: 10s - loss: 1.4965 - regression_loss: 1.2959 - classification_loss: 0.2006 469/500 [===========================>..] - ETA: 10s - loss: 1.4968 - regression_loss: 1.2963 - classification_loss: 0.2005 470/500 [===========================>..] - ETA: 10s - loss: 1.4957 - regression_loss: 1.2953 - classification_loss: 0.2004 471/500 [===========================>..] - ETA: 9s - loss: 1.4956 - regression_loss: 1.2952 - classification_loss: 0.2004  472/500 [===========================>..] - ETA: 9s - loss: 1.4965 - regression_loss: 1.2959 - classification_loss: 0.2006 473/500 [===========================>..] - ETA: 9s - loss: 1.4962 - regression_loss: 1.2956 - classification_loss: 0.2006 474/500 [===========================>..] - ETA: 8s - loss: 1.4941 - regression_loss: 1.2939 - classification_loss: 0.2002 475/500 [===========================>..] - ETA: 8s - loss: 1.4944 - regression_loss: 1.2941 - classification_loss: 0.2003 476/500 [===========================>..] - ETA: 8s - loss: 1.4950 - regression_loss: 1.2945 - classification_loss: 0.2005 477/500 [===========================>..] - ETA: 7s - loss: 1.4952 - regression_loss: 1.2946 - classification_loss: 0.2005 478/500 [===========================>..] - ETA: 7s - loss: 1.4942 - regression_loss: 1.2938 - classification_loss: 0.2004 479/500 [===========================>..] - ETA: 7s - loss: 1.4947 - regression_loss: 1.2943 - classification_loss: 0.2004 480/500 [===========================>..] - ETA: 6s - loss: 1.4955 - regression_loss: 1.2951 - classification_loss: 0.2005 481/500 [===========================>..] - ETA: 6s - loss: 1.4944 - regression_loss: 1.2941 - classification_loss: 0.2002 482/500 [===========================>..] - ETA: 6s - loss: 1.4952 - regression_loss: 1.2949 - classification_loss: 0.2004 483/500 [===========================>..] - ETA: 5s - loss: 1.4956 - regression_loss: 1.2953 - classification_loss: 0.2003 484/500 [============================>.] 
- ETA: 5s - loss: 1.4940 - regression_loss: 1.2939 - classification_loss: 0.2001 485/500 [============================>.] - ETA: 5s - loss: 1.4941 - regression_loss: 1.2941 - classification_loss: 0.2000 486/500 [============================>.] - ETA: 4s - loss: 1.4932 - regression_loss: 1.2933 - classification_loss: 0.1999 487/500 [============================>.] - ETA: 4s - loss: 1.4932 - regression_loss: 1.2932 - classification_loss: 0.2000 488/500 [============================>.] - ETA: 4s - loss: 1.4927 - regression_loss: 1.2929 - classification_loss: 0.1999 489/500 [============================>.] - ETA: 3s - loss: 1.4926 - regression_loss: 1.2927 - classification_loss: 0.1999 490/500 [============================>.] - ETA: 3s - loss: 1.4928 - regression_loss: 1.2929 - classification_loss: 0.1999 491/500 [============================>.] - ETA: 3s - loss: 1.4922 - regression_loss: 1.2925 - classification_loss: 0.1998 492/500 [============================>.] - ETA: 2s - loss: 1.4913 - regression_loss: 1.2917 - classification_loss: 0.1997 493/500 [============================>.] - ETA: 2s - loss: 1.4913 - regression_loss: 1.2917 - classification_loss: 0.1996 494/500 [============================>.] - ETA: 2s - loss: 1.4919 - regression_loss: 1.2921 - classification_loss: 0.1998 495/500 [============================>.] - ETA: 1s - loss: 1.4926 - regression_loss: 1.2927 - classification_loss: 0.1999 496/500 [============================>.] - ETA: 1s - loss: 1.4939 - regression_loss: 1.2940 - classification_loss: 0.2000 497/500 [============================>.] - ETA: 1s - loss: 1.4958 - regression_loss: 1.2956 - classification_loss: 0.2002 498/500 [============================>.] - ETA: 0s - loss: 1.4950 - regression_loss: 1.2948 - classification_loss: 0.2001 499/500 [============================>.] 
- ETA: 0s - loss: 1.4960 - regression_loss: 1.2959 - classification_loss: 0.2002 500/500 [==============================] - 169s 339ms/step - loss: 1.4950 - regression_loss: 1.2950 - classification_loss: 0.2000 1172 instances of class plum with average precision: 0.7410 mAP: 0.7410 Epoch 00012: saving model to ./training/snapshots/resnet101_pascal_12.h5 Epoch 13/150 1/500 [..............................] - ETA: 2:40 - loss: 1.4019 - regression_loss: 1.2554 - classification_loss: 0.1464 2/500 [..............................] - ETA: 2:44 - loss: 1.5342 - regression_loss: 1.3365 - classification_loss: 0.1977 3/500 [..............................] - ETA: 2:46 - loss: 1.5103 - regression_loss: 1.2984 - classification_loss: 0.2120 4/500 [..............................] - ETA: 2:47 - loss: 1.4726 - regression_loss: 1.2704 - classification_loss: 0.2022 5/500 [..............................] - ETA: 2:48 - loss: 1.4577 - regression_loss: 1.2612 - classification_loss: 0.1965 6/500 [..............................] - ETA: 2:49 - loss: 1.4427 - regression_loss: 1.2507 - classification_loss: 0.1920 7/500 [..............................] - ETA: 2:49 - loss: 1.4688 - regression_loss: 1.2700 - classification_loss: 0.1988 8/500 [..............................] - ETA: 2:49 - loss: 1.4722 - regression_loss: 1.2668 - classification_loss: 0.2053 9/500 [..............................] - ETA: 2:47 - loss: 1.4693 - regression_loss: 1.2645 - classification_loss: 0.2048 10/500 [..............................] - ETA: 2:46 - loss: 1.5011 - regression_loss: 1.2886 - classification_loss: 0.2125 11/500 [..............................] - ETA: 2:46 - loss: 1.4954 - regression_loss: 1.2848 - classification_loss: 0.2106 12/500 [..............................] - ETA: 2:45 - loss: 1.5083 - regression_loss: 1.2877 - classification_loss: 0.2206 13/500 [..............................] - ETA: 2:43 - loss: 1.4827 - regression_loss: 1.2676 - classification_loss: 0.2152 14/500 [..............................] 
- ETA: 1:29 - loss: 1.4759 - regression_loss: 1.2776 - classification_loss: 0.1983 239/500 [=============>................] - ETA: 1:28 - loss: 1.4745 - regression_loss: 1.2764 - classification_loss: 0.1980 240/500 [=============>................] - ETA: 1:28 - loss: 1.4713 - regression_loss: 1.2736 - classification_loss: 0.1977 241/500 [=============>................] - ETA: 1:28 - loss: 1.4712 - regression_loss: 1.2734 - classification_loss: 0.1977 242/500 [=============>................] - ETA: 1:27 - loss: 1.4709 - regression_loss: 1.2731 - classification_loss: 0.1977 243/500 [=============>................] - ETA: 1:27 - loss: 1.4710 - regression_loss: 1.2732 - classification_loss: 0.1978 244/500 [=============>................] - ETA: 1:27 - loss: 1.4711 - regression_loss: 1.2730 - classification_loss: 0.1981 245/500 [=============>................] - ETA: 1:26 - loss: 1.4693 - regression_loss: 1.2715 - classification_loss: 0.1977 246/500 [=============>................] - ETA: 1:26 - loss: 1.4677 - regression_loss: 1.2704 - classification_loss: 0.1974 247/500 [=============>................] - ETA: 1:26 - loss: 1.4678 - regression_loss: 1.2705 - classification_loss: 0.1974 248/500 [=============>................] - ETA: 1:25 - loss: 1.4680 - regression_loss: 1.2706 - classification_loss: 0.1974 249/500 [=============>................] - ETA: 1:25 - loss: 1.4685 - regression_loss: 1.2712 - classification_loss: 0.1973 250/500 [==============>...............] - ETA: 1:25 - loss: 1.4686 - regression_loss: 1.2712 - classification_loss: 0.1975 251/500 [==============>...............] - ETA: 1:24 - loss: 1.4683 - regression_loss: 1.2711 - classification_loss: 0.1972 252/500 [==============>...............] - ETA: 1:24 - loss: 1.4679 - regression_loss: 1.2708 - classification_loss: 0.1971 253/500 [==============>...............] - ETA: 1:24 - loss: 1.4693 - regression_loss: 1.2720 - classification_loss: 0.1974 254/500 [==============>...............] 
- ETA: 1:23 - loss: 1.4695 - regression_loss: 1.2722 - classification_loss: 0.1973 255/500 [==============>...............] - ETA: 1:23 - loss: 1.4699 - regression_loss: 1.2726 - classification_loss: 0.1973 256/500 [==============>...............] - ETA: 1:23 - loss: 1.4719 - regression_loss: 1.2741 - classification_loss: 0.1979 257/500 [==============>...............] - ETA: 1:22 - loss: 1.4735 - regression_loss: 1.2753 - classification_loss: 0.1982 258/500 [==============>...............] - ETA: 1:22 - loss: 1.4730 - regression_loss: 1.2750 - classification_loss: 0.1980 259/500 [==============>...............] - ETA: 1:22 - loss: 1.4747 - regression_loss: 1.2766 - classification_loss: 0.1981 260/500 [==============>...............] - ETA: 1:21 - loss: 1.4755 - regression_loss: 1.2774 - classification_loss: 0.1982 261/500 [==============>...............] - ETA: 1:21 - loss: 1.4740 - regression_loss: 1.2761 - classification_loss: 0.1979 262/500 [==============>...............] - ETA: 1:20 - loss: 1.4713 - regression_loss: 1.2738 - classification_loss: 0.1975 263/500 [==============>...............] - ETA: 1:20 - loss: 1.4718 - regression_loss: 1.2743 - classification_loss: 0.1974 264/500 [==============>...............] - ETA: 1:20 - loss: 1.4695 - regression_loss: 1.2725 - classification_loss: 0.1970 265/500 [==============>...............] - ETA: 1:19 - loss: 1.4696 - regression_loss: 1.2727 - classification_loss: 0.1969 266/500 [==============>...............] - ETA: 1:19 - loss: 1.4691 - regression_loss: 1.2724 - classification_loss: 0.1967 267/500 [===============>..............] - ETA: 1:19 - loss: 1.4679 - regression_loss: 1.2715 - classification_loss: 0.1964 268/500 [===============>..............] - ETA: 1:18 - loss: 1.4663 - regression_loss: 1.2700 - classification_loss: 0.1963 269/500 [===============>..............] - ETA: 1:18 - loss: 1.4683 - regression_loss: 1.2719 - classification_loss: 0.1964 270/500 [===============>..............] 
- ETA: 1:18 - loss: 1.4692 - regression_loss: 1.2724 - classification_loss: 0.1968 271/500 [===============>..............] - ETA: 1:17 - loss: 1.4677 - regression_loss: 1.2711 - classification_loss: 0.1965 272/500 [===============>..............] - ETA: 1:17 - loss: 1.4673 - regression_loss: 1.2708 - classification_loss: 0.1965 273/500 [===============>..............] - ETA: 1:17 - loss: 1.4695 - regression_loss: 1.2726 - classification_loss: 0.1970 274/500 [===============>..............] - ETA: 1:16 - loss: 1.4684 - regression_loss: 1.2717 - classification_loss: 0.1967 275/500 [===============>..............] - ETA: 1:16 - loss: 1.4675 - regression_loss: 1.2711 - classification_loss: 0.1964 276/500 [===============>..............] - ETA: 1:16 - loss: 1.4691 - regression_loss: 1.2723 - classification_loss: 0.1968 277/500 [===============>..............] - ETA: 1:15 - loss: 1.4697 - regression_loss: 1.2728 - classification_loss: 0.1970 278/500 [===============>..............] - ETA: 1:15 - loss: 1.4697 - regression_loss: 1.2727 - classification_loss: 0.1969 279/500 [===============>..............] - ETA: 1:15 - loss: 1.4670 - regression_loss: 1.2702 - classification_loss: 0.1968 280/500 [===============>..............] - ETA: 1:14 - loss: 1.4686 - regression_loss: 1.2714 - classification_loss: 0.1972 281/500 [===============>..............] - ETA: 1:14 - loss: 1.4660 - regression_loss: 1.2691 - classification_loss: 0.1969 282/500 [===============>..............] - ETA: 1:14 - loss: 1.4681 - regression_loss: 1.2709 - classification_loss: 0.1971 283/500 [===============>..............] - ETA: 1:13 - loss: 1.4669 - regression_loss: 1.2699 - classification_loss: 0.1970 284/500 [================>.............] - ETA: 1:13 - loss: 1.4671 - regression_loss: 1.2703 - classification_loss: 0.1969 285/500 [================>.............] - ETA: 1:13 - loss: 1.4670 - regression_loss: 1.2702 - classification_loss: 0.1968 286/500 [================>.............] 
- ETA: 1:12 - loss: 1.4656 - regression_loss: 1.2691 - classification_loss: 0.1965 287/500 [================>.............] - ETA: 1:12 - loss: 1.4651 - regression_loss: 1.2686 - classification_loss: 0.1964 288/500 [================>.............] - ETA: 1:12 - loss: 1.4634 - regression_loss: 1.2672 - classification_loss: 0.1962 289/500 [================>.............] - ETA: 1:11 - loss: 1.4638 - regression_loss: 1.2675 - classification_loss: 0.1963 290/500 [================>.............] - ETA: 1:11 - loss: 1.4636 - regression_loss: 1.2674 - classification_loss: 0.1961 291/500 [================>.............] - ETA: 1:11 - loss: 1.4643 - regression_loss: 1.2680 - classification_loss: 0.1963 292/500 [================>.............] - ETA: 1:10 - loss: 1.4635 - regression_loss: 1.2674 - classification_loss: 0.1962 293/500 [================>.............] - ETA: 1:10 - loss: 1.4632 - regression_loss: 1.2673 - classification_loss: 0.1960 294/500 [================>.............] - ETA: 1:10 - loss: 1.4598 - regression_loss: 1.2644 - classification_loss: 0.1955 295/500 [================>.............] - ETA: 1:09 - loss: 1.4589 - regression_loss: 1.2635 - classification_loss: 0.1954 296/500 [================>.............] - ETA: 1:09 - loss: 1.4559 - regression_loss: 1.2609 - classification_loss: 0.1950 297/500 [================>.............] - ETA: 1:09 - loss: 1.4543 - regression_loss: 1.2596 - classification_loss: 0.1947 298/500 [================>.............] - ETA: 1:08 - loss: 1.4539 - regression_loss: 1.2589 - classification_loss: 0.1950 299/500 [================>.............] - ETA: 1:08 - loss: 1.4525 - regression_loss: 1.2578 - classification_loss: 0.1948 300/500 [=================>............] - ETA: 1:08 - loss: 1.4508 - regression_loss: 1.2562 - classification_loss: 0.1946 301/500 [=================>............] - ETA: 1:07 - loss: 1.4515 - regression_loss: 1.2568 - classification_loss: 0.1947 302/500 [=================>............] 
- ETA: 1:07 - loss: 1.4512 - regression_loss: 1.2567 - classification_loss: 0.1945 303/500 [=================>............] - ETA: 1:07 - loss: 1.4520 - regression_loss: 1.2574 - classification_loss: 0.1946 304/500 [=================>............] - ETA: 1:06 - loss: 1.4515 - regression_loss: 1.2572 - classification_loss: 0.1944 305/500 [=================>............] - ETA: 1:06 - loss: 1.4530 - regression_loss: 1.2584 - classification_loss: 0.1946 306/500 [=================>............] - ETA: 1:06 - loss: 1.4525 - regression_loss: 1.2579 - classification_loss: 0.1945 307/500 [=================>............] - ETA: 1:05 - loss: 1.4518 - regression_loss: 1.2575 - classification_loss: 0.1943 308/500 [=================>............] - ETA: 1:05 - loss: 1.4525 - regression_loss: 1.2582 - classification_loss: 0.1942 309/500 [=================>............] - ETA: 1:04 - loss: 1.4544 - regression_loss: 1.2599 - classification_loss: 0.1944 310/500 [=================>............] - ETA: 1:04 - loss: 1.4540 - regression_loss: 1.2596 - classification_loss: 0.1944 311/500 [=================>............] - ETA: 1:04 - loss: 1.4524 - regression_loss: 1.2582 - classification_loss: 0.1942 312/500 [=================>............] - ETA: 1:03 - loss: 1.4542 - regression_loss: 1.2599 - classification_loss: 0.1942 313/500 [=================>............] - ETA: 1:03 - loss: 1.4552 - regression_loss: 1.2605 - classification_loss: 0.1947 314/500 [=================>............] - ETA: 1:03 - loss: 1.4566 - regression_loss: 1.2615 - classification_loss: 0.1951 315/500 [=================>............] - ETA: 1:02 - loss: 1.4556 - regression_loss: 1.2607 - classification_loss: 0.1949 316/500 [=================>............] - ETA: 1:02 - loss: 1.4571 - regression_loss: 1.2620 - classification_loss: 0.1952 317/500 [==================>...........] - ETA: 1:02 - loss: 1.4553 - regression_loss: 1.2604 - classification_loss: 0.1949 318/500 [==================>...........] 
- ETA: 1:01 - loss: 1.4557 - regression_loss: 1.2609 - classification_loss: 0.1948 319/500 [==================>...........] - ETA: 1:01 - loss: 1.4574 - regression_loss: 1.2624 - classification_loss: 0.1950 320/500 [==================>...........] - ETA: 1:01 - loss: 1.4592 - regression_loss: 1.2639 - classification_loss: 0.1953 321/500 [==================>...........] - ETA: 1:00 - loss: 1.4569 - regression_loss: 1.2618 - classification_loss: 0.1952 322/500 [==================>...........] - ETA: 1:00 - loss: 1.4564 - regression_loss: 1.2614 - classification_loss: 0.1950 323/500 [==================>...........] - ETA: 1:00 - loss: 1.4580 - regression_loss: 1.2625 - classification_loss: 0.1954 324/500 [==================>...........] - ETA: 59s - loss: 1.4590 - regression_loss: 1.2635 - classification_loss: 0.1955  325/500 [==================>...........] - ETA: 59s - loss: 1.4578 - regression_loss: 1.2625 - classification_loss: 0.1953 326/500 [==================>...........] - ETA: 59s - loss: 1.4593 - regression_loss: 1.2636 - classification_loss: 0.1958 327/500 [==================>...........] - ETA: 58s - loss: 1.4589 - regression_loss: 1.2631 - classification_loss: 0.1958 328/500 [==================>...........] - ETA: 58s - loss: 1.4580 - regression_loss: 1.2623 - classification_loss: 0.1957 329/500 [==================>...........] - ETA: 58s - loss: 1.4578 - regression_loss: 1.2623 - classification_loss: 0.1956 330/500 [==================>...........] - ETA: 57s - loss: 1.4560 - regression_loss: 1.2607 - classification_loss: 0.1953 331/500 [==================>...........] - ETA: 57s - loss: 1.4577 - regression_loss: 1.2621 - classification_loss: 0.1956 332/500 [==================>...........] - ETA: 57s - loss: 1.4588 - regression_loss: 1.2631 - classification_loss: 0.1957 333/500 [==================>...........] - ETA: 56s - loss: 1.4577 - regression_loss: 1.2621 - classification_loss: 0.1956 334/500 [===================>..........] 
- ETA: 56s - loss: 1.4578 - regression_loss: 1.2625 - classification_loss: 0.1953 335/500 [===================>..........] - ETA: 56s - loss: 1.4587 - regression_loss: 1.2632 - classification_loss: 0.1955 336/500 [===================>..........] - ETA: 55s - loss: 1.4584 - regression_loss: 1.2630 - classification_loss: 0.1954 337/500 [===================>..........] - ETA: 55s - loss: 1.4593 - regression_loss: 1.2640 - classification_loss: 0.1953 338/500 [===================>..........] - ETA: 55s - loss: 1.4578 - regression_loss: 1.2627 - classification_loss: 0.1950 339/500 [===================>..........] - ETA: 54s - loss: 1.4574 - regression_loss: 1.2624 - classification_loss: 0.1950 340/500 [===================>..........] - ETA: 54s - loss: 1.4560 - regression_loss: 1.2613 - classification_loss: 0.1947 341/500 [===================>..........] - ETA: 54s - loss: 1.4561 - regression_loss: 1.2614 - classification_loss: 0.1947 342/500 [===================>..........] - ETA: 53s - loss: 1.4565 - regression_loss: 1.2618 - classification_loss: 0.1948 343/500 [===================>..........] - ETA: 53s - loss: 1.4567 - regression_loss: 1.2619 - classification_loss: 0.1948 344/500 [===================>..........] - ETA: 53s - loss: 1.4569 - regression_loss: 1.2622 - classification_loss: 0.1947 345/500 [===================>..........] - ETA: 52s - loss: 1.4580 - regression_loss: 1.2630 - classification_loss: 0.1950 346/500 [===================>..........] - ETA: 52s - loss: 1.4580 - regression_loss: 1.2631 - classification_loss: 0.1949 347/500 [===================>..........] - ETA: 52s - loss: 1.4568 - regression_loss: 1.2619 - classification_loss: 0.1949 348/500 [===================>..........] - ETA: 51s - loss: 1.4561 - regression_loss: 1.2614 - classification_loss: 0.1947 349/500 [===================>..........] - ETA: 51s - loss: 1.4569 - regression_loss: 1.2621 - classification_loss: 0.1948 350/500 [====================>.........] 
- ETA: 51s - loss: 1.4571 - regression_loss: 1.2622 - classification_loss: 0.1949 351/500 [====================>.........] - ETA: 50s - loss: 1.4562 - regression_loss: 1.2616 - classification_loss: 0.1946 352/500 [====================>.........] - ETA: 50s - loss: 1.4564 - regression_loss: 1.2618 - classification_loss: 0.1946 353/500 [====================>.........] - ETA: 50s - loss: 1.4548 - regression_loss: 1.2604 - classification_loss: 0.1943 354/500 [====================>.........] - ETA: 49s - loss: 1.4546 - regression_loss: 1.2604 - classification_loss: 0.1942 355/500 [====================>.........] - ETA: 49s - loss: 1.4542 - regression_loss: 1.2601 - classification_loss: 0.1941 356/500 [====================>.........] - ETA: 48s - loss: 1.4528 - regression_loss: 1.2590 - classification_loss: 0.1938 357/500 [====================>.........] - ETA: 48s - loss: 1.4538 - regression_loss: 1.2598 - classification_loss: 0.1940 358/500 [====================>.........] - ETA: 48s - loss: 1.4541 - regression_loss: 1.2602 - classification_loss: 0.1940 359/500 [====================>.........] - ETA: 47s - loss: 1.4543 - regression_loss: 1.2602 - classification_loss: 0.1941 360/500 [====================>.........] - ETA: 47s - loss: 1.4535 - regression_loss: 1.2595 - classification_loss: 0.1940 361/500 [====================>.........] - ETA: 47s - loss: 1.4514 - regression_loss: 1.2578 - classification_loss: 0.1936 362/500 [====================>.........] - ETA: 46s - loss: 1.4508 - regression_loss: 1.2574 - classification_loss: 0.1935 363/500 [====================>.........] - ETA: 46s - loss: 1.4487 - regression_loss: 1.2555 - classification_loss: 0.1932 364/500 [====================>.........] - ETA: 46s - loss: 1.4464 - regression_loss: 1.2536 - classification_loss: 0.1928 365/500 [====================>.........] - ETA: 45s - loss: 1.4453 - regression_loss: 1.2527 - classification_loss: 0.1925 366/500 [====================>.........] 
- ETA: 45s - loss: 1.4437 - regression_loss: 1.2514 - classification_loss: 0.1923 367/500 [=====================>........] - ETA: 45s - loss: 1.4437 - regression_loss: 1.2513 - classification_loss: 0.1924 368/500 [=====================>........] - ETA: 44s - loss: 1.4440 - regression_loss: 1.2517 - classification_loss: 0.1923 369/500 [=====================>........] - ETA: 44s - loss: 1.4462 - regression_loss: 1.2536 - classification_loss: 0.1926 370/500 [=====================>........] - ETA: 44s - loss: 1.4472 - regression_loss: 1.2546 - classification_loss: 0.1925 371/500 [=====================>........] - ETA: 43s - loss: 1.4471 - regression_loss: 1.2546 - classification_loss: 0.1925 372/500 [=====================>........] - ETA: 43s - loss: 1.4455 - regression_loss: 1.2533 - classification_loss: 0.1922 373/500 [=====================>........] - ETA: 43s - loss: 1.4458 - regression_loss: 1.2537 - classification_loss: 0.1922 374/500 [=====================>........] - ETA: 42s - loss: 1.4454 - regression_loss: 1.2533 - classification_loss: 0.1921 375/500 [=====================>........] - ETA: 42s - loss: 1.4457 - regression_loss: 1.2536 - classification_loss: 0.1920 376/500 [=====================>........] - ETA: 42s - loss: 1.4463 - regression_loss: 1.2545 - classification_loss: 0.1918 377/500 [=====================>........] - ETA: 41s - loss: 1.4457 - regression_loss: 1.2541 - classification_loss: 0.1915 378/500 [=====================>........] - ETA: 41s - loss: 1.4455 - regression_loss: 1.2541 - classification_loss: 0.1914 379/500 [=====================>........] - ETA: 41s - loss: 1.4459 - regression_loss: 1.2546 - classification_loss: 0.1913 380/500 [=====================>........] - ETA: 40s - loss: 1.4467 - regression_loss: 1.2552 - classification_loss: 0.1915 381/500 [=====================>........] - ETA: 40s - loss: 1.4452 - regression_loss: 1.2540 - classification_loss: 0.1911 382/500 [=====================>........] 
- ETA: 40s - loss: 1.4448 - regression_loss: 1.2536 - classification_loss: 0.1912 383/500 [=====================>........] - ETA: 39s - loss: 1.4443 - regression_loss: 1.2532 - classification_loss: 0.1911 384/500 [======================>.......] - ETA: 39s - loss: 1.4424 - regression_loss: 1.2517 - classification_loss: 0.1907 385/500 [======================>.......] - ETA: 39s - loss: 1.4419 - regression_loss: 1.2514 - classification_loss: 0.1906 386/500 [======================>.......] - ETA: 38s - loss: 1.4396 - regression_loss: 1.2494 - classification_loss: 0.1902 387/500 [======================>.......] - ETA: 38s - loss: 1.4401 - regression_loss: 1.2499 - classification_loss: 0.1902 388/500 [======================>.......] - ETA: 38s - loss: 1.4400 - regression_loss: 1.2498 - classification_loss: 0.1902 389/500 [======================>.......] - ETA: 37s - loss: 1.4396 - regression_loss: 1.2494 - classification_loss: 0.1902 390/500 [======================>.......] - ETA: 37s - loss: 1.4405 - regression_loss: 1.2499 - classification_loss: 0.1906 391/500 [======================>.......] - ETA: 37s - loss: 1.4400 - regression_loss: 1.2494 - classification_loss: 0.1905 392/500 [======================>.......] - ETA: 36s - loss: 1.4400 - regression_loss: 1.2494 - classification_loss: 0.1906 393/500 [======================>.......] - ETA: 36s - loss: 1.4398 - regression_loss: 1.2493 - classification_loss: 0.1905 394/500 [======================>.......] - ETA: 36s - loss: 1.4399 - regression_loss: 1.2494 - classification_loss: 0.1905 395/500 [======================>.......] - ETA: 35s - loss: 1.4390 - regression_loss: 1.2485 - classification_loss: 0.1905 396/500 [======================>.......] - ETA: 35s - loss: 1.4392 - regression_loss: 1.2488 - classification_loss: 0.1904 397/500 [======================>.......] - ETA: 34s - loss: 1.4395 - regression_loss: 1.2491 - classification_loss: 0.1904 398/500 [======================>.......] 
- ETA: 34s - loss: 1.4384 - regression_loss: 1.2481 - classification_loss: 0.1903 399/500 [======================>.......] - ETA: 34s - loss: 1.4385 - regression_loss: 1.2482 - classification_loss: 0.1904 400/500 [=======================>......] - ETA: 33s - loss: 1.4383 - regression_loss: 1.2480 - classification_loss: 0.1904 401/500 [=======================>......] - ETA: 33s - loss: 1.4375 - regression_loss: 1.2474 - classification_loss: 0.1901 402/500 [=======================>......] - ETA: 33s - loss: 1.4382 - regression_loss: 1.2480 - classification_loss: 0.1902 403/500 [=======================>......] - ETA: 32s - loss: 1.4354 - regression_loss: 1.2455 - classification_loss: 0.1898 404/500 [=======================>......] - ETA: 32s - loss: 1.4360 - regression_loss: 1.2461 - classification_loss: 0.1899 405/500 [=======================>......] - ETA: 32s - loss: 1.4370 - regression_loss: 1.2468 - classification_loss: 0.1901 406/500 [=======================>......] - ETA: 31s - loss: 1.4353 - regression_loss: 1.2454 - classification_loss: 0.1899 407/500 [=======================>......] - ETA: 31s - loss: 1.4345 - regression_loss: 1.2447 - classification_loss: 0.1898 408/500 [=======================>......] - ETA: 31s - loss: 1.4357 - regression_loss: 1.2455 - classification_loss: 0.1903 409/500 [=======================>......] - ETA: 30s - loss: 1.4373 - regression_loss: 1.2469 - classification_loss: 0.1904 410/500 [=======================>......] - ETA: 30s - loss: 1.4379 - regression_loss: 1.2474 - classification_loss: 0.1906 411/500 [=======================>......] - ETA: 30s - loss: 1.4385 - regression_loss: 1.2478 - classification_loss: 0.1907 412/500 [=======================>......] - ETA: 29s - loss: 1.4385 - regression_loss: 1.2479 - classification_loss: 0.1906 413/500 [=======================>......] - ETA: 29s - loss: 1.4385 - regression_loss: 1.2479 - classification_loss: 0.1907 414/500 [=======================>......] 
- ETA: 29s - loss: 1.4386 - regression_loss: 1.2479 - classification_loss: 0.1907 415/500 [=======================>......] - ETA: 28s - loss: 1.4369 - regression_loss: 1.2464 - classification_loss: 0.1905 416/500 [=======================>......] - ETA: 28s - loss: 1.4358 - regression_loss: 1.2455 - classification_loss: 0.1903 417/500 [========================>.....] - ETA: 28s - loss: 1.4355 - regression_loss: 1.2452 - classification_loss: 0.1903 418/500 [========================>.....] - ETA: 27s - loss: 1.4354 - regression_loss: 1.2451 - classification_loss: 0.1903 419/500 [========================>.....] - ETA: 27s - loss: 1.4361 - regression_loss: 1.2458 - classification_loss: 0.1903 420/500 [========================>.....] - ETA: 27s - loss: 1.4349 - regression_loss: 1.2448 - classification_loss: 0.1900 421/500 [========================>.....] - ETA: 26s - loss: 1.4338 - regression_loss: 1.2439 - classification_loss: 0.1900 422/500 [========================>.....] - ETA: 26s - loss: 1.4346 - regression_loss: 1.2445 - classification_loss: 0.1901 423/500 [========================>.....] - ETA: 26s - loss: 1.4356 - regression_loss: 1.2453 - classification_loss: 0.1903 424/500 [========================>.....] - ETA: 25s - loss: 1.4357 - regression_loss: 1.2453 - classification_loss: 0.1904 425/500 [========================>.....] - ETA: 25s - loss: 1.4354 - regression_loss: 1.2450 - classification_loss: 0.1904 426/500 [========================>.....] - ETA: 25s - loss: 1.4345 - regression_loss: 1.2442 - classification_loss: 0.1903 427/500 [========================>.....] - ETA: 24s - loss: 1.4333 - regression_loss: 1.2430 - classification_loss: 0.1903 428/500 [========================>.....] - ETA: 24s - loss: 1.4327 - regression_loss: 1.2425 - classification_loss: 0.1902 429/500 [========================>.....] - ETA: 24s - loss: 1.4338 - regression_loss: 1.2434 - classification_loss: 0.1903 430/500 [========================>.....] 
- ETA: 23s - loss: 1.4335 - regression_loss: 1.2432 - classification_loss: 0.1903 431/500 [========================>.....] - ETA: 23s - loss: 1.4338 - regression_loss: 1.2434 - classification_loss: 0.1904 432/500 [========================>.....] - ETA: 23s - loss: 1.4341 - regression_loss: 1.2437 - classification_loss: 0.1904 433/500 [========================>.....] - ETA: 22s - loss: 1.4334 - regression_loss: 1.2431 - classification_loss: 0.1903 434/500 [=========================>....] - ETA: 22s - loss: 1.4343 - regression_loss: 1.2438 - classification_loss: 0.1905 435/500 [=========================>....] - ETA: 22s - loss: 1.4337 - regression_loss: 1.2432 - classification_loss: 0.1905 436/500 [=========================>....] - ETA: 21s - loss: 1.4338 - regression_loss: 1.2434 - classification_loss: 0.1904 437/500 [=========================>....] - ETA: 21s - loss: 1.4337 - regression_loss: 1.2433 - classification_loss: 0.1904 438/500 [=========================>....] - ETA: 21s - loss: 1.4335 - regression_loss: 1.2433 - classification_loss: 0.1903 439/500 [=========================>....] - ETA: 20s - loss: 1.4341 - regression_loss: 1.2436 - classification_loss: 0.1905 440/500 [=========================>....] - ETA: 20s - loss: 1.4335 - regression_loss: 1.2431 - classification_loss: 0.1904 441/500 [=========================>....] - ETA: 19s - loss: 1.4334 - regression_loss: 1.2431 - classification_loss: 0.1904 442/500 [=========================>....] - ETA: 19s - loss: 1.4321 - regression_loss: 1.2420 - classification_loss: 0.1901 443/500 [=========================>....] - ETA: 19s - loss: 1.4327 - regression_loss: 1.2423 - classification_loss: 0.1904 444/500 [=========================>....] - ETA: 18s - loss: 1.4320 - regression_loss: 1.2416 - classification_loss: 0.1903 445/500 [=========================>....] - ETA: 18s - loss: 1.4333 - regression_loss: 1.2430 - classification_loss: 0.1903 446/500 [=========================>....] 
- ETA: 18s - loss: 1.4334 - regression_loss: 1.2431 - classification_loss: 0.1903 447/500 [=========================>....] - ETA: 17s - loss: 1.4341 - regression_loss: 1.2436 - classification_loss: 0.1905 448/500 [=========================>....] - ETA: 17s - loss: 1.4341 - regression_loss: 1.2436 - classification_loss: 0.1904 449/500 [=========================>....] - ETA: 17s - loss: 1.4335 - regression_loss: 1.2431 - classification_loss: 0.1904 450/500 [==========================>...] - ETA: 16s - loss: 1.4324 - regression_loss: 1.2424 - classification_loss: 0.1901 451/500 [==========================>...] - ETA: 16s - loss: 1.4319 - regression_loss: 1.2419 - classification_loss: 0.1900 452/500 [==========================>...] - ETA: 16s - loss: 1.4309 - regression_loss: 1.2411 - classification_loss: 0.1898 453/500 [==========================>...] - ETA: 15s - loss: 1.4316 - regression_loss: 1.2418 - classification_loss: 0.1899 454/500 [==========================>...] - ETA: 15s - loss: 1.4311 - regression_loss: 1.2414 - classification_loss: 0.1897 455/500 [==========================>...] - ETA: 15s - loss: 1.4314 - regression_loss: 1.2416 - classification_loss: 0.1898 456/500 [==========================>...] - ETA: 14s - loss: 1.4321 - regression_loss: 1.2423 - classification_loss: 0.1898 457/500 [==========================>...] - ETA: 14s - loss: 1.4338 - regression_loss: 1.2437 - classification_loss: 0.1901 458/500 [==========================>...] - ETA: 14s - loss: 1.4326 - regression_loss: 1.2427 - classification_loss: 0.1899 459/500 [==========================>...] - ETA: 13s - loss: 1.4332 - regression_loss: 1.2431 - classification_loss: 0.1901 460/500 [==========================>...] - ETA: 13s - loss: 1.4329 - regression_loss: 1.2429 - classification_loss: 0.1900 461/500 [==========================>...] - ETA: 13s - loss: 1.4326 - regression_loss: 1.2426 - classification_loss: 0.1900 462/500 [==========================>...] 
- ETA: 12s - loss: 1.4326 - regression_loss: 1.2426 - classification_loss: 0.1900 463/500 [==========================>...] - ETA: 12s - loss: 1.4333 - regression_loss: 1.2434 - classification_loss: 0.1899 464/500 [==========================>...] - ETA: 12s - loss: 1.4340 - regression_loss: 1.2439 - classification_loss: 0.1900 465/500 [==========================>...] - ETA: 11s - loss: 1.4345 - regression_loss: 1.2443 - classification_loss: 0.1902 466/500 [==========================>...] - ETA: 11s - loss: 1.4346 - regression_loss: 1.2444 - classification_loss: 0.1902 467/500 [===========================>..] - ETA: 11s - loss: 1.4345 - regression_loss: 1.2443 - classification_loss: 0.1902 468/500 [===========================>..] - ETA: 10s - loss: 1.4355 - regression_loss: 1.2452 - classification_loss: 0.1903 469/500 [===========================>..] - ETA: 10s - loss: 1.4362 - regression_loss: 1.2460 - classification_loss: 0.1903 470/500 [===========================>..] - ETA: 10s - loss: 1.4366 - regression_loss: 1.2462 - classification_loss: 0.1903 471/500 [===========================>..] - ETA: 9s - loss: 1.4363 - regression_loss: 1.2459 - classification_loss: 0.1904  472/500 [===========================>..] - ETA: 9s - loss: 1.4370 - regression_loss: 1.2465 - classification_loss: 0.1905 473/500 [===========================>..] - ETA: 9s - loss: 1.4376 - regression_loss: 1.2471 - classification_loss: 0.1906 474/500 [===========================>..] - ETA: 8s - loss: 1.4383 - regression_loss: 1.2476 - classification_loss: 0.1907 475/500 [===========================>..] - ETA: 8s - loss: 1.4388 - regression_loss: 1.2481 - classification_loss: 0.1907 476/500 [===========================>..] - ETA: 8s - loss: 1.4384 - regression_loss: 1.2478 - classification_loss: 0.1906 477/500 [===========================>..] - ETA: 7s - loss: 1.4389 - regression_loss: 1.2482 - classification_loss: 0.1907 478/500 [===========================>..] 
- ETA: 7s - loss: 1.4441 - regression_loss: 1.2485 - classification_loss: 0.1955 479/500 [===========================>..] - ETA: 7s - loss: 1.4450 - regression_loss: 1.2494 - classification_loss: 0.1957 480/500 [===========================>..] - ETA: 6s - loss: 1.4459 - regression_loss: 1.2501 - classification_loss: 0.1959 481/500 [===========================>..] - ETA: 6s - loss: 1.4470 - regression_loss: 1.2510 - classification_loss: 0.1960 482/500 [===========================>..] - ETA: 6s - loss: 1.4470 - regression_loss: 1.2510 - classification_loss: 0.1960 483/500 [===========================>..] - ETA: 5s - loss: 1.4465 - regression_loss: 1.2506 - classification_loss: 0.1959 484/500 [============================>.] - ETA: 5s - loss: 1.4477 - regression_loss: 1.2517 - classification_loss: 0.1960 485/500 [============================>.] - ETA: 5s - loss: 1.4486 - regression_loss: 1.2525 - classification_loss: 0.1962 486/500 [============================>.] - ETA: 4s - loss: 1.4486 - regression_loss: 1.2525 - classification_loss: 0.1961 487/500 [============================>.] - ETA: 4s - loss: 1.4476 - regression_loss: 1.2517 - classification_loss: 0.1959 488/500 [============================>.] - ETA: 4s - loss: 1.4487 - regression_loss: 1.2527 - classification_loss: 0.1961 489/500 [============================>.] - ETA: 3s - loss: 1.4485 - regression_loss: 1.2525 - classification_loss: 0.1960 490/500 [============================>.] - ETA: 3s - loss: 1.4492 - regression_loss: 1.2530 - classification_loss: 0.1961 491/500 [============================>.] - ETA: 3s - loss: 1.4484 - regression_loss: 1.2525 - classification_loss: 0.1959 492/500 [============================>.] - ETA: 2s - loss: 1.4490 - regression_loss: 1.2530 - classification_loss: 0.1960 493/500 [============================>.] - ETA: 2s - loss: 1.4495 - regression_loss: 1.2536 - classification_loss: 0.1959 494/500 [============================>.] 
- ETA: 2s - loss: 1.4479 - regression_loss: 1.2522 - classification_loss: 0.1957 495/500 [============================>.] - ETA: 1s - loss: 1.4486 - regression_loss: 1.2528 - classification_loss: 0.1958 496/500 [============================>.] - ETA: 1s - loss: 1.4487 - regression_loss: 1.2529 - classification_loss: 0.1957 497/500 [============================>.] - ETA: 1s - loss: 1.4480 - regression_loss: 1.2524 - classification_loss: 0.1956 498/500 [============================>.] - ETA: 0s - loss: 1.4489 - regression_loss: 1.2533 - classification_loss: 0.1957 499/500 [============================>.] - ETA: 0s - loss: 1.4499 - regression_loss: 1.2540 - classification_loss: 0.1958 500/500 [==============================] - 170s 339ms/step - loss: 1.4507 - regression_loss: 1.2548 - classification_loss: 0.1959
1172 instances of class plum with average precision: 0.7449
mAP: 0.7449
Epoch 00013: saving model to ./training/snapshots/resnet101_pascal_13.h5
Epoch 14/150
1/500 [..............................] - ETA: 2:40 - loss: 1.0791 - regression_loss: 0.7810 - classification_loss: 0.2981 2/500 [..............................] - ETA: 2:41 - loss: 1.3513 - regression_loss: 1.0968 - classification_loss: 0.2545 3/500 [..............................] - ETA: 2:43 - loss: 1.4664 - regression_loss: 1.2170 - classification_loss: 0.2494 4/500 [..............................] - ETA: 2:45 - loss: 1.3944 - regression_loss: 1.1696 - classification_loss: 0.2248 5/500 [..............................] - ETA: 2:45 - loss: 1.5263 - regression_loss: 1.2915 - classification_loss: 0.2348 6/500 [..............................] - ETA: 2:46 - loss: 1.4108 - regression_loss: 1.1981 - classification_loss: 0.2127 7/500 [..............................] - ETA: 2:46 - loss: 1.4848 - regression_loss: 1.2627 - classification_loss: 0.2221 8/500 [..............................] - ETA: 2:46 - loss: 1.4702 - regression_loss: 1.2533 - classification_loss: 0.2169 9/500 [..............................] 
- ETA: 2:45 - loss: 1.4171 - regression_loss: 1.2147 - classification_loss: 0.2024 10/500 [..............................] - ETA: 2:45 - loss: 1.4825 - regression_loss: 1.2742 - classification_loss: 0.2083 11/500 [..............................] - ETA: 2:44 - loss: 1.5948 - regression_loss: 1.3793 - classification_loss: 0.2155 12/500 [..............................] - ETA: 2:43 - loss: 1.5381 - regression_loss: 1.3311 - classification_loss: 0.2070 13/500 [..............................] - ETA: 2:42 - loss: 1.5616 - regression_loss: 1.3462 - classification_loss: 0.2154 14/500 [..............................] - ETA: 2:42 - loss: 1.5651 - regression_loss: 1.3519 - classification_loss: 0.2132 15/500 [..............................] - ETA: 2:42 - loss: 1.5546 - regression_loss: 1.3436 - classification_loss: 0.2110 16/500 [..............................] - ETA: 2:41 - loss: 1.5617 - regression_loss: 1.3514 - classification_loss: 0.2103 17/500 [>.............................] - ETA: 2:41 - loss: 1.5457 - regression_loss: 1.3387 - classification_loss: 0.2070 18/500 [>.............................] - ETA: 2:41 - loss: 1.5310 - regression_loss: 1.3276 - classification_loss: 0.2033 19/500 [>.............................] - ETA: 2:41 - loss: 1.5088 - regression_loss: 1.3099 - classification_loss: 0.1989 20/500 [>.............................] - ETA: 2:41 - loss: 1.5853 - regression_loss: 1.3741 - classification_loss: 0.2112 21/500 [>.............................] - ETA: 2:40 - loss: 1.5338 - regression_loss: 1.3300 - classification_loss: 0.2038 22/500 [>.............................] - ETA: 2:40 - loss: 1.5213 - regression_loss: 1.3210 - classification_loss: 0.2003 23/500 [>.............................] - ETA: 2:39 - loss: 1.5345 - regression_loss: 1.3328 - classification_loss: 0.2017 24/500 [>.............................] - ETA: 2:39 - loss: 1.4966 - regression_loss: 1.2997 - classification_loss: 0.1968 25/500 [>.............................] 
- ETA: 2:39 - loss: 1.4926 - regression_loss: 1.2968 - classification_loss: 0.1958 26/500 [>.............................] - ETA: 2:39 - loss: 1.4605 - regression_loss: 1.2697 - classification_loss: 0.1908 27/500 [>.............................] - ETA: 2:39 - loss: 1.4722 - regression_loss: 1.2793 - classification_loss: 0.1929 28/500 [>.............................] - ETA: 2:39 - loss: 1.4720 - regression_loss: 1.2795 - classification_loss: 0.1925 29/500 [>.............................] - ETA: 2:39 - loss: 1.4838 - regression_loss: 1.2886 - classification_loss: 0.1951 30/500 [>.............................] - ETA: 2:38 - loss: 1.4705 - regression_loss: 1.2779 - classification_loss: 0.1926 31/500 [>.............................] - ETA: 2:38 - loss: 1.4624 - regression_loss: 1.2715 - classification_loss: 0.1909 32/500 [>.............................] - ETA: 2:37 - loss: 1.4623 - regression_loss: 1.2721 - classification_loss: 0.1902 33/500 [>.............................] - ETA: 2:37 - loss: 1.4567 - regression_loss: 1.2675 - classification_loss: 0.1893 34/500 [=>............................] - ETA: 2:37 - loss: 1.4552 - regression_loss: 1.2661 - classification_loss: 0.1891 35/500 [=>............................] - ETA: 2:36 - loss: 1.4624 - regression_loss: 1.2720 - classification_loss: 0.1904 36/500 [=>............................] - ETA: 2:36 - loss: 1.4631 - regression_loss: 1.2723 - classification_loss: 0.1907 37/500 [=>............................] - ETA: 2:36 - loss: 1.4774 - regression_loss: 1.2850 - classification_loss: 0.1923 38/500 [=>............................] - ETA: 2:36 - loss: 1.4852 - regression_loss: 1.2911 - classification_loss: 0.1941 39/500 [=>............................] - ETA: 2:36 - loss: 1.4807 - regression_loss: 1.2868 - classification_loss: 0.1939 40/500 [=>............................] - ETA: 2:35 - loss: 1.4770 - regression_loss: 1.2829 - classification_loss: 0.1941 41/500 [=>............................] 
- ETA: 2:35 - loss: 1.4868 - regression_loss: 1.2919 - classification_loss: 0.1949 42/500 [=>............................] - ETA: 2:35 - loss: 1.4820 - regression_loss: 1.2877 - classification_loss: 0.1942 43/500 [=>............................] - ETA: 2:34 - loss: 1.4903 - regression_loss: 1.2951 - classification_loss: 0.1952 44/500 [=>............................] - ETA: 2:34 - loss: 1.4986 - regression_loss: 1.3036 - classification_loss: 0.1950 45/500 [=>............................] - ETA: 2:34 - loss: 1.5088 - regression_loss: 1.3115 - classification_loss: 0.1972 46/500 [=>............................] - ETA: 2:33 - loss: 1.5110 - regression_loss: 1.3145 - classification_loss: 0.1965 47/500 [=>............................] - ETA: 2:33 - loss: 1.5203 - regression_loss: 1.3230 - classification_loss: 0.1974 48/500 [=>............................] - ETA: 2:33 - loss: 1.5249 - regression_loss: 1.3273 - classification_loss: 0.1976 49/500 [=>............................] - ETA: 2:32 - loss: 1.5083 - regression_loss: 1.3134 - classification_loss: 0.1949 50/500 [==>...........................] - ETA: 2:32 - loss: 1.4972 - regression_loss: 1.3037 - classification_loss: 0.1935 51/500 [==>...........................] - ETA: 2:32 - loss: 1.4913 - regression_loss: 1.2990 - classification_loss: 0.1923 52/500 [==>...........................] - ETA: 2:31 - loss: 1.4942 - regression_loss: 1.3012 - classification_loss: 0.1930 53/500 [==>...........................] - ETA: 2:31 - loss: 1.4848 - regression_loss: 1.2927 - classification_loss: 0.1921 54/500 [==>...........................] - ETA: 2:31 - loss: 1.4655 - regression_loss: 1.2763 - classification_loss: 0.1892 55/500 [==>...........................] - ETA: 2:30 - loss: 1.4618 - regression_loss: 1.2725 - classification_loss: 0.1893 56/500 [==>...........................] - ETA: 2:30 - loss: 1.4638 - regression_loss: 1.2739 - classification_loss: 0.1899 57/500 [==>...........................] 
- ETA: 2:30 - loss: 1.4710 - regression_loss: 1.2790 - classification_loss: 0.1920 58/500 [==>...........................] - ETA: 2:29 - loss: 1.4719 - regression_loss: 1.2806 - classification_loss: 0.1913 59/500 [==>...........................] - ETA: 2:29 - loss: 1.4673 - regression_loss: 1.2767 - classification_loss: 0.1907 60/500 [==>...........................] - ETA: 2:29 - loss: 1.4706 - regression_loss: 1.2790 - classification_loss: 0.1916 61/500 [==>...........................] - ETA: 2:28 - loss: 1.4747 - regression_loss: 1.2824 - classification_loss: 0.1924 62/500 [==>...........................] - ETA: 2:28 - loss: 1.4821 - regression_loss: 1.2893 - classification_loss: 0.1928 63/500 [==>...........................] - ETA: 2:28 - loss: 1.4906 - regression_loss: 1.2966 - classification_loss: 0.1940 64/500 [==>...........................] - ETA: 2:27 - loss: 1.4897 - regression_loss: 1.2949 - classification_loss: 0.1948 65/500 [==>...........................] - ETA: 2:27 - loss: 1.4878 - regression_loss: 1.2934 - classification_loss: 0.1944 66/500 [==>...........................] - ETA: 2:27 - loss: 1.4841 - regression_loss: 1.2904 - classification_loss: 0.1936 67/500 [===>..........................] - ETA: 2:26 - loss: 1.4822 - regression_loss: 1.2892 - classification_loss: 0.1930 68/500 [===>..........................] - ETA: 2:26 - loss: 1.4824 - regression_loss: 1.2893 - classification_loss: 0.1931 69/500 [===>..........................] - ETA: 2:26 - loss: 1.4804 - regression_loss: 1.2877 - classification_loss: 0.1927 70/500 [===>..........................] - ETA: 2:26 - loss: 1.4717 - regression_loss: 1.2799 - classification_loss: 0.1918 71/500 [===>..........................] - ETA: 2:25 - loss: 1.4709 - regression_loss: 1.2787 - classification_loss: 0.1923 72/500 [===>..........................] - ETA: 2:25 - loss: 1.4644 - regression_loss: 1.2734 - classification_loss: 0.1910 73/500 [===>..........................] 
- ETA: 2:25 - loss: 1.4659 - regression_loss: 1.2749 - classification_loss: 0.1910 74/500 [===>..........................] - ETA: 2:24 - loss: 1.4619 - regression_loss: 1.2717 - classification_loss: 0.1902 75/500 [===>..........................] - ETA: 2:24 - loss: 1.4651 - regression_loss: 1.2747 - classification_loss: 0.1904 76/500 [===>..........................] - ETA: 2:24 - loss: 1.4649 - regression_loss: 1.2741 - classification_loss: 0.1907 77/500 [===>..........................] - ETA: 2:23 - loss: 1.4685 - regression_loss: 1.2771 - classification_loss: 0.1915 78/500 [===>..........................] - ETA: 2:23 - loss: 1.4843 - regression_loss: 1.2891 - classification_loss: 0.1953 79/500 [===>..........................] - ETA: 2:23 - loss: 1.4829 - regression_loss: 1.2880 - classification_loss: 0.1949 80/500 [===>..........................] - ETA: 2:22 - loss: 1.4811 - regression_loss: 1.2865 - classification_loss: 0.1946 81/500 [===>..........................] - ETA: 2:22 - loss: 1.4818 - regression_loss: 1.2864 - classification_loss: 0.1955 82/500 [===>..........................] - ETA: 2:22 - loss: 1.4801 - regression_loss: 1.2850 - classification_loss: 0.1951 83/500 [===>..........................] - ETA: 2:21 - loss: 1.4871 - regression_loss: 1.2914 - classification_loss: 0.1957 84/500 [====>.........................] - ETA: 2:21 - loss: 1.4900 - regression_loss: 1.2937 - classification_loss: 0.1962 85/500 [====>.........................] - ETA: 2:21 - loss: 1.4888 - regression_loss: 1.2933 - classification_loss: 0.1955 86/500 [====>.........................] - ETA: 2:20 - loss: 1.4872 - regression_loss: 1.2918 - classification_loss: 0.1953 87/500 [====>.........................] - ETA: 2:20 - loss: 1.4838 - regression_loss: 1.2892 - classification_loss: 0.1946 88/500 [====>.........................] - ETA: 2:20 - loss: 1.4841 - regression_loss: 1.2894 - classification_loss: 0.1947 89/500 [====>.........................] 
- ETA: 2:19 - loss: 1.4833 - regression_loss: 1.2887 - classification_loss: 0.1946 90/500 [====>.........................] - ETA: 2:19 - loss: 1.4834 - regression_loss: 1.2889 - classification_loss: 0.1945 91/500 [====>.........................] - ETA: 2:19 - loss: 1.4837 - regression_loss: 1.2892 - classification_loss: 0.1945 92/500 [====>.........................] - ETA: 2:18 - loss: 1.4817 - regression_loss: 1.2874 - classification_loss: 0.1943 93/500 [====>.........................] - ETA: 2:18 - loss: 1.4784 - regression_loss: 1.2843 - classification_loss: 0.1941 94/500 [====>.........................] - ETA: 2:18 - loss: 1.4792 - regression_loss: 1.2850 - classification_loss: 0.1942 95/500 [====>.........................] - ETA: 2:17 - loss: 1.4778 - regression_loss: 1.2840 - classification_loss: 0.1938 96/500 [====>.........................] - ETA: 2:17 - loss: 1.4800 - regression_loss: 1.2858 - classification_loss: 0.1943 97/500 [====>.........................] - ETA: 2:17 - loss: 1.4808 - regression_loss: 1.2866 - classification_loss: 0.1942 98/500 [====>.........................] - ETA: 2:16 - loss: 1.4814 - regression_loss: 1.2874 - classification_loss: 0.1940 99/500 [====>.........................] - ETA: 2:16 - loss: 1.4842 - regression_loss: 1.2900 - classification_loss: 0.1942 100/500 [=====>........................] - ETA: 2:15 - loss: 1.4806 - regression_loss: 1.2866 - classification_loss: 0.1940 101/500 [=====>........................] - ETA: 2:15 - loss: 1.4796 - regression_loss: 1.2857 - classification_loss: 0.1939 102/500 [=====>........................] - ETA: 2:15 - loss: 1.4845 - regression_loss: 1.2886 - classification_loss: 0.1959 103/500 [=====>........................] - ETA: 2:14 - loss: 1.4863 - regression_loss: 1.2903 - classification_loss: 0.1960 104/500 [=====>........................] - ETA: 2:14 - loss: 1.4796 - regression_loss: 1.2847 - classification_loss: 0.1948 105/500 [=====>........................] 
- ETA: 2:14 - loss: 1.4734 - regression_loss: 1.2798 - classification_loss: 0.1936 106/500 [=====>........................] - ETA: 2:13 - loss: 1.4760 - regression_loss: 1.2820 - classification_loss: 0.1940 107/500 [=====>........................] - ETA: 2:13 - loss: 1.4746 - regression_loss: 1.2809 - classification_loss: 0.1937 108/500 [=====>........................] - ETA: 2:13 - loss: 1.4730 - regression_loss: 1.2790 - classification_loss: 0.1940 109/500 [=====>........................] - ETA: 2:13 - loss: 1.4769 - regression_loss: 1.2827 - classification_loss: 0.1942 110/500 [=====>........................] - ETA: 2:12 - loss: 1.4736 - regression_loss: 1.2800 - classification_loss: 0.1936 111/500 [=====>........................] - ETA: 2:12 - loss: 1.4724 - regression_loss: 1.2791 - classification_loss: 0.1933 112/500 [=====>........................] - ETA: 2:12 - loss: 1.4759 - regression_loss: 1.2823 - classification_loss: 0.1936 113/500 [=====>........................] - ETA: 2:11 - loss: 1.4738 - regression_loss: 1.2809 - classification_loss: 0.1928 114/500 [=====>........................] - ETA: 2:11 - loss: 1.4756 - regression_loss: 1.2824 - classification_loss: 0.1932 115/500 [=====>........................] - ETA: 2:11 - loss: 1.4771 - regression_loss: 1.2826 - classification_loss: 0.1946 116/500 [=====>........................] - ETA: 2:10 - loss: 1.4707 - regression_loss: 1.2774 - classification_loss: 0.1933 117/500 [======>.......................] - ETA: 2:10 - loss: 1.4708 - regression_loss: 1.2776 - classification_loss: 0.1932 118/500 [======>.......................] - ETA: 2:10 - loss: 1.4732 - regression_loss: 1.2795 - classification_loss: 0.1937 119/500 [======>.......................] - ETA: 2:09 - loss: 1.4761 - regression_loss: 1.2820 - classification_loss: 0.1941 120/500 [======>.......................] - ETA: 2:09 - loss: 1.4747 - regression_loss: 1.2808 - classification_loss: 0.1939 121/500 [======>.......................] 
- ETA: 2:09 - loss: 1.4750 - regression_loss: 1.2812 - classification_loss: 0.1938 122/500 [======>.......................] - ETA: 2:08 - loss: 1.4761 - regression_loss: 1.2819 - classification_loss: 0.1942 123/500 [======>.......................] - ETA: 2:08 - loss: 1.4769 - regression_loss: 1.2822 - classification_loss: 0.1947 124/500 [======>.......................] - ETA: 2:08 - loss: 1.4702 - regression_loss: 1.2766 - classification_loss: 0.1935 125/500 [======>.......................] - ETA: 2:07 - loss: 1.4688 - regression_loss: 1.2761 - classification_loss: 0.1927 126/500 [======>.......................] - ETA: 2:07 - loss: 1.4699 - regression_loss: 1.2768 - classification_loss: 0.1931 127/500 [======>.......................] - ETA: 2:07 - loss: 1.4632 - regression_loss: 1.2712 - classification_loss: 0.1920 128/500 [======>.......................] - ETA: 2:06 - loss: 1.4660 - regression_loss: 1.2738 - classification_loss: 0.1923 129/500 [======>.......................] - ETA: 2:06 - loss: 1.4655 - regression_loss: 1.2731 - classification_loss: 0.1924 130/500 [======>.......................] - ETA: 2:06 - loss: 1.4699 - regression_loss: 1.2770 - classification_loss: 0.1929 131/500 [======>.......................] - ETA: 2:05 - loss: 1.4658 - regression_loss: 1.2733 - classification_loss: 0.1926 132/500 [======>.......................] - ETA: 2:05 - loss: 1.4587 - regression_loss: 1.2671 - classification_loss: 0.1916 133/500 [======>.......................] - ETA: 2:05 - loss: 1.4626 - regression_loss: 1.2708 - classification_loss: 0.1918 134/500 [=======>......................] - ETA: 2:04 - loss: 1.4585 - regression_loss: 1.2674 - classification_loss: 0.1910 135/500 [=======>......................] - ETA: 2:04 - loss: 1.4575 - regression_loss: 1.2664 - classification_loss: 0.1911 136/500 [=======>......................] - ETA: 2:04 - loss: 1.4576 - regression_loss: 1.2665 - classification_loss: 0.1911 137/500 [=======>......................] 
- ETA: 2:03 - loss: 1.4521 - regression_loss: 1.2616 - classification_loss: 0.1904 138/500 [=======>......................] - ETA: 2:03 - loss: 1.4535 - regression_loss: 1.2628 - classification_loss: 0.1907 139/500 [=======>......................] - ETA: 2:03 - loss: 1.4507 - regression_loss: 1.2603 - classification_loss: 0.1904 140/500 [=======>......................] - ETA: 2:02 - loss: 1.4507 - regression_loss: 1.2605 - classification_loss: 0.1902 141/500 [=======>......................] - ETA: 2:02 - loss: 1.4481 - regression_loss: 1.2584 - classification_loss: 0.1897 142/500 [=======>......................] - ETA: 2:02 - loss: 1.4444 - regression_loss: 1.2550 - classification_loss: 0.1894 143/500 [=======>......................] - ETA: 2:01 - loss: 1.4452 - regression_loss: 1.2557 - classification_loss: 0.1895 144/500 [=======>......................] - ETA: 2:01 - loss: 1.4476 - regression_loss: 1.2582 - classification_loss: 0.1894 145/500 [=======>......................] - ETA: 2:01 - loss: 1.4437 - regression_loss: 1.2548 - classification_loss: 0.1888 146/500 [=======>......................] - ETA: 2:00 - loss: 1.4385 - regression_loss: 1.2506 - classification_loss: 0.1879 147/500 [=======>......................] - ETA: 2:00 - loss: 1.4400 - regression_loss: 1.2516 - classification_loss: 0.1884 148/500 [=======>......................] - ETA: 1:59 - loss: 1.4399 - regression_loss: 1.2514 - classification_loss: 0.1884 149/500 [=======>......................] - ETA: 1:59 - loss: 1.4386 - regression_loss: 1.2506 - classification_loss: 0.1881 150/500 [========>.....................] - ETA: 1:59 - loss: 1.4330 - regression_loss: 1.2458 - classification_loss: 0.1872 151/500 [========>.....................] - ETA: 1:58 - loss: 1.4311 - regression_loss: 1.2441 - classification_loss: 0.1869 152/500 [========>.....................] - ETA: 1:58 - loss: 1.4335 - regression_loss: 1.2460 - classification_loss: 0.1875 153/500 [========>.....................] 
- ETA: 1:58 - loss: 1.4343 - regression_loss: 1.2471 - classification_loss: 0.1872 154/500 [========>.....................] - ETA: 1:57 - loss: 1.4372 - regression_loss: 1.2496 - classification_loss: 0.1876 155/500 [========>.....................] - ETA: 1:57 - loss: 1.4335 - regression_loss: 1.2462 - classification_loss: 0.1873 156/500 [========>.....................] - ETA: 1:57 - loss: 1.4360 - regression_loss: 1.2480 - classification_loss: 0.1880 157/500 [========>.....................] - ETA: 1:56 - loss: 1.4336 - regression_loss: 1.2460 - classification_loss: 0.1877 158/500 [========>.....................] - ETA: 1:56 - loss: 1.4338 - regression_loss: 1.2462 - classification_loss: 0.1876 159/500 [========>.....................] - ETA: 1:56 - loss: 1.4293 - regression_loss: 1.2423 - classification_loss: 0.1870 160/500 [========>.....................] - ETA: 1:55 - loss: 1.4245 - regression_loss: 1.2383 - classification_loss: 0.1862 161/500 [========>.....................] - ETA: 1:55 - loss: 1.4269 - regression_loss: 1.2401 - classification_loss: 0.1867 162/500 [========>.....................] - ETA: 1:55 - loss: 1.4291 - regression_loss: 1.2419 - classification_loss: 0.1872 163/500 [========>.....................] - ETA: 1:54 - loss: 1.4266 - regression_loss: 1.2399 - classification_loss: 0.1867 164/500 [========>.....................] - ETA: 1:54 - loss: 1.4314 - regression_loss: 1.2439 - classification_loss: 0.1876 165/500 [========>.....................] - ETA: 1:54 - loss: 1.4312 - regression_loss: 1.2439 - classification_loss: 0.1873 166/500 [========>.....................] - ETA: 1:53 - loss: 1.4337 - regression_loss: 1.2463 - classification_loss: 0.1874 167/500 [=========>....................] - ETA: 1:53 - loss: 1.4323 - regression_loss: 1.2451 - classification_loss: 0.1872 168/500 [=========>....................] - ETA: 1:53 - loss: 1.4318 - regression_loss: 1.2449 - classification_loss: 0.1870 169/500 [=========>....................] 
- ETA: 1:52 - loss: 1.4290 - regression_loss: 1.2424 - classification_loss: 0.1866 170/500 [=========>....................] - ETA: 1:52 - loss: 1.4291 - regression_loss: 1.2425 - classification_loss: 0.1866 171/500 [=========>....................] - ETA: 1:52 - loss: 1.4290 - regression_loss: 1.2427 - classification_loss: 0.1863 172/500 [=========>....................] - ETA: 1:51 - loss: 1.4288 - regression_loss: 1.2426 - classification_loss: 0.1861 173/500 [=========>....................] - ETA: 1:51 - loss: 1.4259 - regression_loss: 1.2400 - classification_loss: 0.1858 174/500 [=========>....................] - ETA: 1:51 - loss: 1.4254 - regression_loss: 1.2397 - classification_loss: 0.1857 175/500 [=========>....................] - ETA: 1:50 - loss: 1.4251 - regression_loss: 1.2395 - classification_loss: 0.1856 176/500 [=========>....................] - ETA: 1:50 - loss: 1.4254 - regression_loss: 1.2400 - classification_loss: 0.1854 177/500 [=========>....................] - ETA: 1:50 - loss: 1.4275 - regression_loss: 1.2418 - classification_loss: 0.1857 178/500 [=========>....................] - ETA: 1:49 - loss: 1.4271 - regression_loss: 1.2413 - classification_loss: 0.1858 179/500 [=========>....................] - ETA: 1:49 - loss: 1.4238 - regression_loss: 1.2386 - classification_loss: 0.1853 180/500 [=========>....................] - ETA: 1:49 - loss: 1.4261 - regression_loss: 1.2406 - classification_loss: 0.1855 181/500 [=========>....................] - ETA: 1:48 - loss: 1.4255 - regression_loss: 1.2400 - classification_loss: 0.1855 182/500 [=========>....................] - ETA: 1:48 - loss: 1.4272 - regression_loss: 1.2414 - classification_loss: 0.1858 183/500 [=========>....................] - ETA: 1:48 - loss: 1.4269 - regression_loss: 1.2407 - classification_loss: 0.1862 184/500 [==========>...................] - ETA: 1:47 - loss: 1.4260 - regression_loss: 1.2399 - classification_loss: 0.1861 185/500 [==========>...................] 
- ETA: 1:47 - loss: 1.4283 - regression_loss: 1.2418 - classification_loss: 0.1865 186/500 [==========>...................] - ETA: 1:47 - loss: 1.4261 - regression_loss: 1.2398 - classification_loss: 0.1863 187/500 [==========>...................] - ETA: 1:46 - loss: 1.4267 - regression_loss: 1.2404 - classification_loss: 0.1863 188/500 [==========>...................] - ETA: 1:46 - loss: 1.4262 - regression_loss: 1.2403 - classification_loss: 0.1859 189/500 [==========>...................] - ETA: 1:45 - loss: 1.4271 - regression_loss: 1.2412 - classification_loss: 0.1859 190/500 [==========>...................] - ETA: 1:45 - loss: 1.4304 - regression_loss: 1.2436 - classification_loss: 0.1868 191/500 [==========>...................] - ETA: 1:45 - loss: 1.4283 - regression_loss: 1.2418 - classification_loss: 0.1866 192/500 [==========>...................] - ETA: 1:44 - loss: 1.4286 - regression_loss: 1.2421 - classification_loss: 0.1865 193/500 [==========>...................] - ETA: 1:44 - loss: 1.4264 - regression_loss: 1.2402 - classification_loss: 0.1862 194/500 [==========>...................] - ETA: 1:44 - loss: 1.4262 - regression_loss: 1.2402 - classification_loss: 0.1860 195/500 [==========>...................] - ETA: 1:43 - loss: 1.4279 - regression_loss: 1.2420 - classification_loss: 0.1859 196/500 [==========>...................] - ETA: 1:43 - loss: 1.4256 - regression_loss: 1.2398 - classification_loss: 0.1858 197/500 [==========>...................] - ETA: 1:43 - loss: 1.4246 - regression_loss: 1.2387 - classification_loss: 0.1859 198/500 [==========>...................] - ETA: 1:42 - loss: 1.4242 - regression_loss: 1.2389 - classification_loss: 0.1853 199/500 [==========>...................] - ETA: 1:42 - loss: 1.4232 - regression_loss: 1.2383 - classification_loss: 0.1849 200/500 [===========>..................] - ETA: 1:42 - loss: 1.4249 - regression_loss: 1.2397 - classification_loss: 0.1851 201/500 [===========>..................] 
- ETA: 1:41 - loss: 1.4233 - regression_loss: 1.2384 - classification_loss: 0.1849 202/500 [===========>..................] - ETA: 1:41 - loss: 1.4238 - regression_loss: 1.2389 - classification_loss: 0.1849 203/500 [===========>..................] - ETA: 1:41 - loss: 1.4249 - regression_loss: 1.2396 - classification_loss: 0.1852 204/500 [===========>..................] - ETA: 1:40 - loss: 1.4238 - regression_loss: 1.2390 - classification_loss: 0.1848 205/500 [===========>..................] - ETA: 1:40 - loss: 1.4218 - regression_loss: 1.2372 - classification_loss: 0.1845 206/500 [===========>..................] - ETA: 1:40 - loss: 1.4218 - regression_loss: 1.2375 - classification_loss: 0.1844 207/500 [===========>..................] - ETA: 1:39 - loss: 1.4196 - regression_loss: 1.2357 - classification_loss: 0.1839 208/500 [===========>..................] - ETA: 1:39 - loss: 1.4185 - regression_loss: 1.2350 - classification_loss: 0.1835 209/500 [===========>..................] - ETA: 1:38 - loss: 1.4170 - regression_loss: 1.2340 - classification_loss: 0.1830 210/500 [===========>..................] - ETA: 1:38 - loss: 1.4155 - regression_loss: 1.2326 - classification_loss: 0.1829 211/500 [===========>..................] - ETA: 1:38 - loss: 1.4140 - regression_loss: 1.2313 - classification_loss: 0.1827 212/500 [===========>..................] - ETA: 1:37 - loss: 1.4122 - regression_loss: 1.2298 - classification_loss: 0.1824 213/500 [===========>..................] - ETA: 1:37 - loss: 1.4129 - regression_loss: 1.2306 - classification_loss: 0.1824 214/500 [===========>..................] - ETA: 1:37 - loss: 1.4133 - regression_loss: 1.2308 - classification_loss: 0.1825 215/500 [===========>..................] - ETA: 1:36 - loss: 1.4146 - regression_loss: 1.2319 - classification_loss: 0.1827 216/500 [===========>..................] - ETA: 1:36 - loss: 1.4153 - regression_loss: 1.2324 - classification_loss: 0.1829 217/500 [============>.................] 
- ETA: 1:36 - loss: 1.4170 - regression_loss: 1.2334 - classification_loss: 0.1836 218/500 [============>.................] - ETA: 1:35 - loss: 1.4129 - regression_loss: 1.2299 - classification_loss: 0.1830 219/500 [============>.................] - ETA: 1:35 - loss: 1.4131 - regression_loss: 1.2301 - classification_loss: 0.1829 220/500 [============>.................] - ETA: 1:35 - loss: 1.4127 - regression_loss: 1.2302 - classification_loss: 0.1825 221/500 [============>.................] - ETA: 1:34 - loss: 1.4126 - regression_loss: 1.2302 - classification_loss: 0.1824 222/500 [============>.................] - ETA: 1:34 - loss: 1.4088 - regression_loss: 1.2270 - classification_loss: 0.1818 223/500 [============>.................] - ETA: 1:34 - loss: 1.4063 - regression_loss: 1.2250 - classification_loss: 0.1812 224/500 [============>.................] - ETA: 1:33 - loss: 1.4093 - regression_loss: 1.2277 - classification_loss: 0.1815 225/500 [============>.................] - ETA: 1:33 - loss: 1.4095 - regression_loss: 1.2281 - classification_loss: 0.1815 226/500 [============>.................] - ETA: 1:33 - loss: 1.4079 - regression_loss: 1.2268 - classification_loss: 0.1811 227/500 [============>.................] - ETA: 1:32 - loss: 1.4043 - regression_loss: 1.2239 - classification_loss: 0.1805 228/500 [============>.................] - ETA: 1:32 - loss: 1.4043 - regression_loss: 1.2239 - classification_loss: 0.1805 229/500 [============>.................] - ETA: 1:32 - loss: 1.4064 - regression_loss: 1.2255 - classification_loss: 0.1809 230/500 [============>.................] - ETA: 1:31 - loss: 1.4075 - regression_loss: 1.2265 - classification_loss: 0.1810 231/500 [============>.................] - ETA: 1:31 - loss: 1.4073 - regression_loss: 1.2263 - classification_loss: 0.1810 232/500 [============>.................] - ETA: 1:31 - loss: 1.4040 - regression_loss: 1.2235 - classification_loss: 0.1805 233/500 [============>.................] 
- ETA: 1:30 - loss: 1.4057 - regression_loss: 1.2249 - classification_loss: 0.1808 234/500 [=============>................] - ETA: 1:30 - loss: 1.4023 - regression_loss: 1.2220 - classification_loss: 0.1803 235/500 [=============>................] - ETA: 1:30 - loss: 1.4043 - regression_loss: 1.2237 - classification_loss: 0.1806 236/500 [=============>................] - ETA: 1:29 - loss: 1.4044 - regression_loss: 1.2237 - classification_loss: 0.1806 237/500 [=============>................] - ETA: 1:29 - loss: 1.4014 - regression_loss: 1.2212 - classification_loss: 0.1802 238/500 [=============>................] - ETA: 1:29 - loss: 1.4022 - regression_loss: 1.2218 - classification_loss: 0.1804 239/500 [=============>................] - ETA: 1:28 - loss: 1.4023 - regression_loss: 1.2221 - classification_loss: 0.1802 240/500 [=============>................] - ETA: 1:28 - loss: 1.3979 - regression_loss: 1.2181 - classification_loss: 0.1798 241/500 [=============>................] - ETA: 1:28 - loss: 1.3990 - regression_loss: 1.2190 - classification_loss: 0.1799 242/500 [=============>................] - ETA: 1:27 - loss: 1.4008 - regression_loss: 1.2207 - classification_loss: 0.1801 243/500 [=============>................] - ETA: 1:27 - loss: 1.4018 - regression_loss: 1.2215 - classification_loss: 0.1802 244/500 [=============>................] - ETA: 1:27 - loss: 1.4008 - regression_loss: 1.2206 - classification_loss: 0.1801 245/500 [=============>................] - ETA: 1:26 - loss: 1.4021 - regression_loss: 1.2216 - classification_loss: 0.1805 246/500 [=============>................] - ETA: 1:26 - loss: 1.4018 - regression_loss: 1.2213 - classification_loss: 0.1805 247/500 [=============>................] - ETA: 1:26 - loss: 1.4026 - regression_loss: 1.2220 - classification_loss: 0.1806 248/500 [=============>................] - ETA: 1:25 - loss: 1.4032 - regression_loss: 1.2225 - classification_loss: 0.1807 249/500 [=============>................] 
- ETA: 1:25 - loss: 1.4033 - regression_loss: 1.2225 - classification_loss: 0.1808 250/500 [==============>...............] - ETA: 1:25 - loss: 1.4035 - regression_loss: 1.2227 - classification_loss: 0.1808 251/500 [==============>...............] - ETA: 1:24 - loss: 1.4037 - regression_loss: 1.2228 - classification_loss: 0.1808 252/500 [==============>...............] - ETA: 1:24 - loss: 1.4025 - regression_loss: 1.2219 - classification_loss: 0.1806 253/500 [==============>...............] - ETA: 1:24 - loss: 1.4031 - regression_loss: 1.2224 - classification_loss: 0.1807 254/500 [==============>...............] - ETA: 1:23 - loss: 1.3998 - regression_loss: 1.2195 - classification_loss: 0.1803 255/500 [==============>...............] - ETA: 1:23 - loss: 1.3995 - regression_loss: 1.2192 - classification_loss: 0.1803 256/500 [==============>...............] - ETA: 1:23 - loss: 1.3995 - regression_loss: 1.2192 - classification_loss: 0.1803 257/500 [==============>...............] - ETA: 1:22 - loss: 1.4024 - regression_loss: 1.2217 - classification_loss: 0.1808 258/500 [==============>...............] - ETA: 1:22 - loss: 1.4042 - regression_loss: 1.2232 - classification_loss: 0.1810 259/500 [==============>...............] - ETA: 1:22 - loss: 1.4060 - regression_loss: 1.2247 - classification_loss: 0.1813 260/500 [==============>...............] - ETA: 1:21 - loss: 1.4063 - regression_loss: 1.2251 - classification_loss: 0.1812 261/500 [==============>...............] - ETA: 1:21 - loss: 1.4059 - regression_loss: 1.2250 - classification_loss: 0.1810 262/500 [==============>...............] - ETA: 1:21 - loss: 1.4073 - regression_loss: 1.2261 - classification_loss: 0.1812 263/500 [==============>...............] - ETA: 1:20 - loss: 1.4073 - regression_loss: 1.2260 - classification_loss: 0.1813 264/500 [==============>...............] - ETA: 1:20 - loss: 1.4084 - regression_loss: 1.2268 - classification_loss: 0.1816 265/500 [==============>...............] 
- ETA: 1:19 - loss: 1.4069 - regression_loss: 1.2256 - classification_loss: 0.1813 266/500 [==============>...............] - ETA: 1:19 - loss: 1.4087 - regression_loss: 1.2270 - classification_loss: 0.1817 267/500 [===============>..............] - ETA: 1:19 - loss: 1.4065 - regression_loss: 1.2250 - classification_loss: 0.1815 268/500 [===============>..............] - ETA: 1:18 - loss: 1.4063 - regression_loss: 1.2248 - classification_loss: 0.1815 269/500 [===============>..............] - ETA: 1:18 - loss: 1.4040 - regression_loss: 1.2228 - classification_loss: 0.1813 270/500 [===============>..............] - ETA: 1:18 - loss: 1.4064 - regression_loss: 1.2247 - classification_loss: 0.1816 271/500 [===============>..............] - ETA: 1:17 - loss: 1.4092 - regression_loss: 1.2272 - classification_loss: 0.1820 272/500 [===============>..............] - ETA: 1:17 - loss: 1.4095 - regression_loss: 1.2275 - classification_loss: 0.1819 273/500 [===============>..............] - ETA: 1:17 - loss: 1.4095 - regression_loss: 1.2276 - classification_loss: 0.1819 274/500 [===============>..............] - ETA: 1:16 - loss: 1.4101 - regression_loss: 1.2277 - classification_loss: 0.1824 275/500 [===============>..............] - ETA: 1:16 - loss: 1.4091 - regression_loss: 1.2270 - classification_loss: 0.1822 276/500 [===============>..............] - ETA: 1:16 - loss: 1.4090 - regression_loss: 1.2268 - classification_loss: 0.1822 277/500 [===============>..............] - ETA: 1:15 - loss: 1.4093 - regression_loss: 1.2270 - classification_loss: 0.1822 278/500 [===============>..............] - ETA: 1:15 - loss: 1.4092 - regression_loss: 1.2270 - classification_loss: 0.1822 279/500 [===============>..............] - ETA: 1:15 - loss: 1.4093 - regression_loss: 1.2271 - classification_loss: 0.1822 280/500 [===============>..............] - ETA: 1:14 - loss: 1.4087 - regression_loss: 1.2266 - classification_loss: 0.1821 281/500 [===============>..............] 
- ETA: 1:14 - loss: 1.4089 - regression_loss: 1.2268 - classification_loss: 0.1821 282/500 [===============>..............] - ETA: 1:14 - loss: 1.4115 - regression_loss: 1.2291 - classification_loss: 0.1824 283/500 [===============>..............] - ETA: 1:13 - loss: 1.4126 - regression_loss: 1.2301 - classification_loss: 0.1825 284/500 [================>.............] - ETA: 1:13 - loss: 1.4124 - regression_loss: 1.2298 - classification_loss: 0.1826 285/500 [================>.............] - ETA: 1:13 - loss: 1.4122 - regression_loss: 1.2297 - classification_loss: 0.1825 286/500 [================>.............] - ETA: 1:12 - loss: 1.4092 - regression_loss: 1.2271 - classification_loss: 0.1821 287/500 [================>.............] - ETA: 1:12 - loss: 1.4113 - regression_loss: 1.2290 - classification_loss: 0.1823 288/500 [================>.............] - ETA: 1:12 - loss: 1.4122 - regression_loss: 1.2297 - classification_loss: 0.1825 289/500 [================>.............] - ETA: 1:11 - loss: 1.4131 - regression_loss: 1.2304 - classification_loss: 0.1827 290/500 [================>.............] - ETA: 1:11 - loss: 1.4149 - regression_loss: 1.2319 - classification_loss: 0.1830 291/500 [================>.............] - ETA: 1:11 - loss: 1.4145 - regression_loss: 1.2316 - classification_loss: 0.1828 292/500 [================>.............] - ETA: 1:10 - loss: 1.4138 - regression_loss: 1.2310 - classification_loss: 0.1828 293/500 [================>.............] - ETA: 1:10 - loss: 1.4136 - regression_loss: 1.2307 - classification_loss: 0.1830 294/500 [================>.............] - ETA: 1:10 - loss: 1.4135 - regression_loss: 1.2302 - classification_loss: 0.1833 295/500 [================>.............] - ETA: 1:09 - loss: 1.4115 - regression_loss: 1.2286 - classification_loss: 0.1829 296/500 [================>.............] - ETA: 1:09 - loss: 1.4137 - regression_loss: 1.2305 - classification_loss: 0.1832 297/500 [================>.............] 
- ETA: 1:09 - loss: 1.4107 - regression_loss: 1.2280 - classification_loss: 0.1827 298/500 [================>.............] - ETA: 1:08 - loss: 1.4075 - regression_loss: 1.2253 - classification_loss: 0.1822 299/500 [================>.............] - ETA: 1:08 - loss: 1.4098 - regression_loss: 1.2268 - classification_loss: 0.1831 300/500 [=================>............] - ETA: 1:08 - loss: 1.4076 - regression_loss: 1.2249 - classification_loss: 0.1828 301/500 [=================>............] - ETA: 1:07 - loss: 1.4083 - regression_loss: 1.2254 - classification_loss: 0.1828 302/500 [=================>............] - ETA: 1:07 - loss: 1.4087 - regression_loss: 1.2260 - classification_loss: 0.1827 303/500 [=================>............] - ETA: 1:07 - loss: 1.4066 - regression_loss: 1.2240 - classification_loss: 0.1826 304/500 [=================>............] - ETA: 1:06 - loss: 1.4043 - regression_loss: 1.2221 - classification_loss: 0.1822 305/500 [=================>............] - ETA: 1:06 - loss: 1.4058 - regression_loss: 1.2232 - classification_loss: 0.1826 306/500 [=================>............] - ETA: 1:06 - loss: 1.4041 - regression_loss: 1.2218 - classification_loss: 0.1823 307/500 [=================>............] - ETA: 1:05 - loss: 1.4038 - regression_loss: 1.2215 - classification_loss: 0.1823 308/500 [=================>............] - ETA: 1:05 - loss: 1.4028 - regression_loss: 1.2206 - classification_loss: 0.1822 309/500 [=================>............] - ETA: 1:05 - loss: 1.4035 - regression_loss: 1.2211 - classification_loss: 0.1824 310/500 [=================>............] - ETA: 1:04 - loss: 1.4027 - regression_loss: 1.2201 - classification_loss: 0.1826 311/500 [=================>............] - ETA: 1:04 - loss: 1.4044 - regression_loss: 1.2217 - classification_loss: 0.1828 312/500 [=================>............] - ETA: 1:04 - loss: 1.4059 - regression_loss: 1.2230 - classification_loss: 0.1829 313/500 [=================>............] 
- ETA: 1:03 - loss: 1.4047 - regression_loss: 1.2220 - classification_loss: 0.1827 314/500 [=================>............] - ETA: 1:03 - loss: 1.4043 - regression_loss: 1.2217 - classification_loss: 0.1826 315/500 [=================>............] - ETA: 1:03 - loss: 1.4058 - regression_loss: 1.2230 - classification_loss: 0.1828 316/500 [=================>............] - ETA: 1:02 - loss: 1.4049 - regression_loss: 1.2223 - classification_loss: 0.1826 317/500 [==================>...........] - ETA: 1:02 - loss: 1.4044 - regression_loss: 1.2219 - classification_loss: 0.1825 318/500 [==================>...........] - ETA: 1:02 - loss: 1.4055 - regression_loss: 1.2227 - classification_loss: 0.1827 319/500 [==================>...........] - ETA: 1:01 - loss: 1.4065 - regression_loss: 1.2237 - classification_loss: 0.1828 320/500 [==================>...........] - ETA: 1:01 - loss: 1.4101 - regression_loss: 1.2266 - classification_loss: 0.1834 321/500 [==================>...........] - ETA: 1:00 - loss: 1.4109 - regression_loss: 1.2274 - classification_loss: 0.1835 322/500 [==================>...........] - ETA: 1:00 - loss: 1.4090 - regression_loss: 1.2258 - classification_loss: 0.1832 323/500 [==================>...........] - ETA: 1:00 - loss: 1.4087 - regression_loss: 1.2256 - classification_loss: 0.1830 324/500 [==================>...........] - ETA: 59s - loss: 1.4085 - regression_loss: 1.2253 - classification_loss: 0.1832  325/500 [==================>...........] - ETA: 59s - loss: 1.4108 - regression_loss: 1.2275 - classification_loss: 0.1833 326/500 [==================>...........] - ETA: 59s - loss: 1.4108 - regression_loss: 1.2275 - classification_loss: 0.1833 327/500 [==================>...........] - ETA: 58s - loss: 1.4139 - regression_loss: 1.2298 - classification_loss: 0.1841 328/500 [==================>...........] - ETA: 58s - loss: 1.4136 - regression_loss: 1.2295 - classification_loss: 0.1841 329/500 [==================>...........] 
- ETA: 58s - loss: 1.4141 - regression_loss: 1.2298 - classification_loss: 0.1843 330/500 [==================>...........] - ETA: 57s - loss: 1.4115 - regression_loss: 1.2275 - classification_loss: 0.1840 331/500 [==================>...........] - ETA: 57s - loss: 1.4092 - regression_loss: 1.2254 - classification_loss: 0.1838 332/500 [==================>...........] - ETA: 57s - loss: 1.4095 - regression_loss: 1.2258 - classification_loss: 0.1837 333/500 [==================>...........] - ETA: 56s - loss: 1.4100 - regression_loss: 1.2263 - classification_loss: 0.1837 334/500 [===================>..........] - ETA: 56s - loss: 1.4098 - regression_loss: 1.2262 - classification_loss: 0.1836 335/500 [===================>..........] - ETA: 56s - loss: 1.4090 - regression_loss: 1.2254 - classification_loss: 0.1836 336/500 [===================>..........] - ETA: 55s - loss: 1.4096 - regression_loss: 1.2257 - classification_loss: 0.1839 337/500 [===================>..........] - ETA: 55s - loss: 1.4104 - regression_loss: 1.2263 - classification_loss: 0.1841 338/500 [===================>..........] - ETA: 55s - loss: 1.4110 - regression_loss: 1.2270 - classification_loss: 0.1840 339/500 [===================>..........] - ETA: 54s - loss: 1.4115 - regression_loss: 1.2273 - classification_loss: 0.1842 340/500 [===================>..........] - ETA: 54s - loss: 1.4119 - regression_loss: 1.2278 - classification_loss: 0.1841 341/500 [===================>..........] - ETA: 54s - loss: 1.4123 - regression_loss: 1.2280 - classification_loss: 0.1843 342/500 [===================>..........] - ETA: 53s - loss: 1.4112 - regression_loss: 1.2270 - classification_loss: 0.1842 343/500 [===================>..........] - ETA: 53s - loss: 1.4113 - regression_loss: 1.2270 - classification_loss: 0.1843 344/500 [===================>..........] - ETA: 53s - loss: 1.4118 - regression_loss: 1.2274 - classification_loss: 0.1845 345/500 [===================>..........] 
- ETA: 52s - loss: 1.4107 - regression_loss: 1.2264 - classification_loss: 0.1843 346/500 [===================>..........] - ETA: 52s - loss: 1.4126 - regression_loss: 1.2280 - classification_loss: 0.1846 347/500 [===================>..........] - ETA: 52s - loss: 1.4143 - regression_loss: 1.2293 - classification_loss: 0.1850 348/500 [===================>..........] - ETA: 51s - loss: 1.4130 - regression_loss: 1.2283 - classification_loss: 0.1848 349/500 [===================>..........] - ETA: 51s - loss: 1.4127 - regression_loss: 1.2279 - classification_loss: 0.1848 350/500 [====================>.........] - ETA: 51s - loss: 1.4139 - regression_loss: 1.2289 - classification_loss: 0.1849 351/500 [====================>.........] - ETA: 50s - loss: 1.4130 - regression_loss: 1.2282 - classification_loss: 0.1848 352/500 [====================>.........] - ETA: 50s - loss: 1.4122 - regression_loss: 1.2275 - classification_loss: 0.1847 353/500 [====================>.........] - ETA: 50s - loss: 1.4131 - regression_loss: 1.2283 - classification_loss: 0.1848 354/500 [====================>.........] - ETA: 49s - loss: 1.4128 - regression_loss: 1.2281 - classification_loss: 0.1847 355/500 [====================>.........] - ETA: 49s - loss: 1.4125 - regression_loss: 1.2278 - classification_loss: 0.1847 356/500 [====================>.........] - ETA: 49s - loss: 1.4132 - regression_loss: 1.2285 - classification_loss: 0.1847 357/500 [====================>.........] - ETA: 48s - loss: 1.4113 - regression_loss: 1.2270 - classification_loss: 0.1843 358/500 [====================>.........] - ETA: 48s - loss: 1.4092 - regression_loss: 1.2252 - classification_loss: 0.1841 359/500 [====================>.........] - ETA: 48s - loss: 1.4084 - regression_loss: 1.2244 - classification_loss: 0.1839 360/500 [====================>.........] - ETA: 47s - loss: 1.4093 - regression_loss: 1.2252 - classification_loss: 0.1841 361/500 [====================>.........] 
- ETA: 47s - loss: 1.4100 - regression_loss: 1.2259 - classification_loss: 0.1841 362/500 [====================>.........] - ETA: 47s - loss: 1.4105 - regression_loss: 1.2263 - classification_loss: 0.1843 363/500 [====================>.........] - ETA: 46s - loss: 1.4105 - regression_loss: 1.2263 - classification_loss: 0.1842 364/500 [====================>.........] - ETA: 46s - loss: 1.4113 - regression_loss: 1.2270 - classification_loss: 0.1842 365/500 [====================>.........] - ETA: 45s - loss: 1.4094 - regression_loss: 1.2255 - classification_loss: 0.1839 366/500 [====================>.........] - ETA: 45s - loss: 1.4077 - regression_loss: 1.2240 - classification_loss: 0.1837 367/500 [=====================>........] - ETA: 45s - loss: 1.4105 - regression_loss: 1.2263 - classification_loss: 0.1842 368/500 [=====================>........] - ETA: 44s - loss: 1.4115 - regression_loss: 1.2270 - classification_loss: 0.1845 369/500 [=====================>........] - ETA: 44s - loss: 1.4125 - regression_loss: 1.2279 - classification_loss: 0.1846 370/500 [=====================>........] - ETA: 44s - loss: 1.4138 - regression_loss: 1.2289 - classification_loss: 0.1849 371/500 [=====================>........] - ETA: 43s - loss: 1.4147 - regression_loss: 1.2297 - classification_loss: 0.1850 372/500 [=====================>........] - ETA: 43s - loss: 1.4150 - regression_loss: 1.2300 - classification_loss: 0.1850 373/500 [=====================>........] - ETA: 43s - loss: 1.4156 - regression_loss: 1.2306 - classification_loss: 0.1850 374/500 [=====================>........] - ETA: 42s - loss: 1.4160 - regression_loss: 1.2311 - classification_loss: 0.1850 375/500 [=====================>........] - ETA: 42s - loss: 1.4166 - regression_loss: 1.2314 - classification_loss: 0.1853 376/500 [=====================>........] - ETA: 42s - loss: 1.4150 - regression_loss: 1.2298 - classification_loss: 0.1852 377/500 [=====================>........] 
- ETA: 41s - loss: 1.4150 - regression_loss: 1.2298 - classification_loss: 0.1852 378/500 [=====================>........] - ETA: 41s - loss: 1.4145 - regression_loss: 1.2293 - classification_loss: 0.1851 379/500 [=====================>........] - ETA: 41s - loss: 1.4146 - regression_loss: 1.2296 - classification_loss: 0.1851 380/500 [=====================>........] - ETA: 40s - loss: 1.4128 - regression_loss: 1.2281 - classification_loss: 0.1848 381/500 [=====================>........] - ETA: 40s - loss: 1.4139 - regression_loss: 1.2291 - classification_loss: 0.1848 382/500 [=====================>........] - ETA: 40s - loss: 1.4142 - regression_loss: 1.2293 - classification_loss: 0.1850 383/500 [=====================>........] - ETA: 39s - loss: 1.4139 - regression_loss: 1.2290 - classification_loss: 0.1849 384/500 [======================>.......] - ETA: 39s - loss: 1.4130 - regression_loss: 1.2282 - classification_loss: 0.1848 385/500 [======================>.......] - ETA: 39s - loss: 1.4126 - regression_loss: 1.2279 - classification_loss: 0.1847 386/500 [======================>.......] - ETA: 38s - loss: 1.4136 - regression_loss: 1.2287 - classification_loss: 0.1849 387/500 [======================>.......] - ETA: 38s - loss: 1.4128 - regression_loss: 1.2281 - classification_loss: 0.1847 388/500 [======================>.......] - ETA: 38s - loss: 1.4115 - regression_loss: 1.2270 - classification_loss: 0.1845 389/500 [======================>.......] - ETA: 37s - loss: 1.4113 - regression_loss: 1.2268 - classification_loss: 0.1845 390/500 [======================>.......] - ETA: 37s - loss: 1.4117 - regression_loss: 1.2273 - classification_loss: 0.1843 391/500 [======================>.......] - ETA: 37s - loss: 1.4098 - regression_loss: 1.2257 - classification_loss: 0.1841 392/500 [======================>.......] - ETA: 36s - loss: 1.4125 - regression_loss: 1.2279 - classification_loss: 0.1847 393/500 [======================>.......] 
- ETA: 36s - loss: 1.4118 - regression_loss: 1.2272 - classification_loss: 0.1846 394/500 [======================>.......] - ETA: 36s - loss: 1.4101 - regression_loss: 1.2257 - classification_loss: 0.1843 395/500 [======================>.......] - ETA: 35s - loss: 1.4092 - regression_loss: 1.2250 - classification_loss: 0.1842 396/500 [======================>.......] - ETA: 35s - loss: 1.4094 - regression_loss: 1.2252 - classification_loss: 0.1842 397/500 [======================>.......] - ETA: 35s - loss: 1.4096 - regression_loss: 1.2253 - classification_loss: 0.1843 398/500 [======================>.......] - ETA: 34s - loss: 1.4073 - regression_loss: 1.2233 - classification_loss: 0.1840 399/500 [======================>.......] - ETA: 34s - loss: 1.4070 - regression_loss: 1.2231 - classification_loss: 0.1839 400/500 [=======================>......] - ETA: 34s - loss: 1.4079 - regression_loss: 1.2241 - classification_loss: 0.1838 401/500 [=======================>......] - ETA: 33s - loss: 1.4076 - regression_loss: 1.2238 - classification_loss: 0.1838 402/500 [=======================>......] - ETA: 33s - loss: 1.4087 - regression_loss: 1.2248 - classification_loss: 0.1840 403/500 [=======================>......] - ETA: 33s - loss: 1.4082 - regression_loss: 1.2242 - classification_loss: 0.1840 404/500 [=======================>......] - ETA: 32s - loss: 1.4085 - regression_loss: 1.2245 - classification_loss: 0.1840 405/500 [=======================>......] - ETA: 32s - loss: 1.4067 - regression_loss: 1.2229 - classification_loss: 0.1839 406/500 [=======================>......] - ETA: 31s - loss: 1.4061 - regression_loss: 1.2224 - classification_loss: 0.1837 407/500 [=======================>......] - ETA: 31s - loss: 1.4051 - regression_loss: 1.2216 - classification_loss: 0.1835 408/500 [=======================>......] - ETA: 31s - loss: 1.4044 - regression_loss: 1.2210 - classification_loss: 0.1835 409/500 [=======================>......] 
- ETA: 30s - loss: 1.4051 - regression_loss: 1.2217 - classification_loss: 0.1835 410/500 [=======================>......] - ETA: 30s - loss: 1.4053 - regression_loss: 1.2219 - classification_loss: 0.1835 411/500 [=======================>......] - ETA: 30s - loss: 1.4062 - regression_loss: 1.2225 - classification_loss: 0.1837 412/500 [=======================>......] - ETA: 29s - loss: 1.4071 - regression_loss: 1.2232 - classification_loss: 0.1838 413/500 [=======================>......] - ETA: 29s - loss: 1.4067 - regression_loss: 1.2229 - classification_loss: 0.1838 414/500 [=======================>......] - ETA: 29s - loss: 1.4078 - regression_loss: 1.2238 - classification_loss: 0.1840 415/500 [=======================>......] - ETA: 28s - loss: 1.4088 - regression_loss: 1.2246 - classification_loss: 0.1841 416/500 [=======================>......] - ETA: 28s - loss: 1.4081 - regression_loss: 1.2241 - classification_loss: 0.1840 417/500 [========================>.....] - ETA: 28s - loss: 1.4078 - regression_loss: 1.2239 - classification_loss: 0.1839 418/500 [========================>.....] - ETA: 27s - loss: 1.4082 - regression_loss: 1.2244 - classification_loss: 0.1839 419/500 [========================>.....] - ETA: 27s - loss: 1.4088 - regression_loss: 1.2250 - classification_loss: 0.1839 420/500 [========================>.....] - ETA: 27s - loss: 1.4089 - regression_loss: 1.2250 - classification_loss: 0.1838 421/500 [========================>.....] - ETA: 26s - loss: 1.4094 - regression_loss: 1.2256 - classification_loss: 0.1838 422/500 [========================>.....] - ETA: 26s - loss: 1.4109 - regression_loss: 1.2266 - classification_loss: 0.1843 423/500 [========================>.....] - ETA: 26s - loss: 1.4108 - regression_loss: 1.2266 - classification_loss: 0.1842 424/500 [========================>.....] - ETA: 25s - loss: 1.4116 - regression_loss: 1.2273 - classification_loss: 0.1843 425/500 [========================>.....] 
- ETA: 25s - loss: 1.4123 - regression_loss: 1.2280 - classification_loss: 0.1843 426/500 [========================>.....] - ETA: 25s - loss: 1.4119 - regression_loss: 1.2277 - classification_loss: 0.1842 427/500 [========================>.....] - ETA: 24s - loss: 1.4116 - regression_loss: 1.2275 - classification_loss: 0.1841 428/500 [========================>.....] - ETA: 24s - loss: 1.4113 - regression_loss: 1.2274 - classification_loss: 0.1840 429/500 [========================>.....] - ETA: 24s - loss: 1.4092 - regression_loss: 1.2255 - classification_loss: 0.1837 430/500 [========================>.....] - ETA: 23s - loss: 1.4078 - regression_loss: 1.2243 - classification_loss: 0.1835 431/500 [========================>.....] - ETA: 23s - loss: 1.4083 - regression_loss: 1.2247 - classification_loss: 0.1835 432/500 [========================>.....] - ETA: 23s - loss: 1.4087 - regression_loss: 1.2251 - classification_loss: 0.1836 433/500 [========================>.....] - ETA: 22s - loss: 1.4083 - regression_loss: 1.2249 - classification_loss: 0.1835 434/500 [=========================>....] - ETA: 22s - loss: 1.4086 - regression_loss: 1.2250 - classification_loss: 0.1836 435/500 [=========================>....] - ETA: 22s - loss: 1.4067 - regression_loss: 1.2234 - classification_loss: 0.1833 436/500 [=========================>....] - ETA: 21s - loss: 1.4061 - regression_loss: 1.2229 - classification_loss: 0.1832 437/500 [=========================>....] - ETA: 21s - loss: 1.4044 - regression_loss: 1.2215 - classification_loss: 0.1830 438/500 [=========================>....] - ETA: 21s - loss: 1.4040 - regression_loss: 1.2210 - classification_loss: 0.1829 439/500 [=========================>....] - ETA: 20s - loss: 1.4026 - regression_loss: 1.2199 - classification_loss: 0.1828 440/500 [=========================>....] - ETA: 20s - loss: 1.4032 - regression_loss: 1.2204 - classification_loss: 0.1828 441/500 [=========================>....] 
- ETA: 20s - loss: 1.4026 - regression_loss: 1.2198 - classification_loss: 0.1828 442/500 [=========================>....] - ETA: 19s - loss: 1.4037 - regression_loss: 1.2208 - classification_loss: 0.1829 443/500 [=========================>....] - ETA: 19s - loss: 1.4017 - regression_loss: 1.2191 - classification_loss: 0.1826 444/500 [=========================>....] - ETA: 19s - loss: 1.4028 - regression_loss: 1.2202 - classification_loss: 0.1826 445/500 [=========================>....] - ETA: 18s - loss: 1.4034 - regression_loss: 1.2208 - classification_loss: 0.1826 446/500 [=========================>....] - ETA: 18s - loss: 1.4025 - regression_loss: 1.2200 - classification_loss: 0.1825 447/500 [=========================>....] - ETA: 18s - loss: 1.4026 - regression_loss: 1.2201 - classification_loss: 0.1825 448/500 [=========================>....] - ETA: 17s - loss: 1.4030 - regression_loss: 1.2204 - classification_loss: 0.1826 449/500 [=========================>....] - ETA: 17s - loss: 1.4032 - regression_loss: 1.2206 - classification_loss: 0.1826 450/500 [==========================>...] - ETA: 17s - loss: 1.4032 - regression_loss: 1.2205 - classification_loss: 0.1827 451/500 [==========================>...] - ETA: 16s - loss: 1.4032 - regression_loss: 1.2205 - classification_loss: 0.1827 452/500 [==========================>...] - ETA: 16s - loss: 1.4023 - regression_loss: 1.2197 - classification_loss: 0.1826 453/500 [==========================>...] - ETA: 15s - loss: 1.4035 - regression_loss: 1.2206 - classification_loss: 0.1829 454/500 [==========================>...] - ETA: 15s - loss: 1.4046 - regression_loss: 1.2214 - classification_loss: 0.1832 455/500 [==========================>...] - ETA: 15s - loss: 1.4051 - regression_loss: 1.2218 - classification_loss: 0.1833 456/500 [==========================>...] - ETA: 14s - loss: 1.4053 - regression_loss: 1.2219 - classification_loss: 0.1834 457/500 [==========================>...] 
- ETA: 14s - loss: 1.4072 - regression_loss: 1.2235 - classification_loss: 0.1836 458/500 [==========================>...] - ETA: 14s - loss: 1.4066 - regression_loss: 1.2229 - classification_loss: 0.1836 459/500 [==========================>...] - ETA: 13s - loss: 1.4091 - regression_loss: 1.2250 - classification_loss: 0.1841 460/500 [==========================>...] - ETA: 13s - loss: 1.4106 - regression_loss: 1.2264 - classification_loss: 0.1842 461/500 [==========================>...] - ETA: 13s - loss: 1.4117 - regression_loss: 1.2273 - classification_loss: 0.1844 462/500 [==========================>...] - ETA: 12s - loss: 1.4105 - regression_loss: 1.2264 - classification_loss: 0.1841 463/500 [==========================>...] - ETA: 12s - loss: 1.4108 - regression_loss: 1.2266 - classification_loss: 0.1842 464/500 [==========================>...] - ETA: 12s - loss: 1.4102 - regression_loss: 1.2261 - classification_loss: 0.1841 465/500 [==========================>...] - ETA: 11s - loss: 1.4085 - regression_loss: 1.2247 - classification_loss: 0.1838 466/500 [==========================>...] - ETA: 11s - loss: 1.4085 - regression_loss: 1.2247 - classification_loss: 0.1838 467/500 [===========================>..] - ETA: 11s - loss: 1.4076 - regression_loss: 1.2239 - classification_loss: 0.1837 468/500 [===========================>..] - ETA: 10s - loss: 1.4067 - regression_loss: 1.2231 - classification_loss: 0.1835 469/500 [===========================>..] - ETA: 10s - loss: 1.4061 - regression_loss: 1.2228 - classification_loss: 0.1834 470/500 [===========================>..] - ETA: 10s - loss: 1.4063 - regression_loss: 1.2229 - classification_loss: 0.1834 471/500 [===========================>..] - ETA: 9s - loss: 1.4064 - regression_loss: 1.2231 - classification_loss: 0.1834  472/500 [===========================>..] - ETA: 9s - loss: 1.4072 - regression_loss: 1.2237 - classification_loss: 0.1834 473/500 [===========================>..] 
- ETA: 9s - loss: 1.4075 - regression_loss: 1.2241 - classification_loss: 0.1834 474/500 [===========================>..] - ETA: 8s - loss: 1.4074 - regression_loss: 1.2240 - classification_loss: 0.1835 475/500 [===========================>..] - ETA: 8s - loss: 1.4079 - regression_loss: 1.2243 - classification_loss: 0.1835 476/500 [===========================>..] - ETA: 8s - loss: 1.4086 - regression_loss: 1.2248 - classification_loss: 0.1838 477/500 [===========================>..] - ETA: 7s - loss: 1.4088 - regression_loss: 1.2250 - classification_loss: 0.1838 478/500 [===========================>..] - ETA: 7s - loss: 1.4079 - regression_loss: 1.2243 - classification_loss: 0.1836 479/500 [===========================>..] - ETA: 7s - loss: 1.4079 - regression_loss: 1.2243 - classification_loss: 0.1836 480/500 [===========================>..] - ETA: 6s - loss: 1.4081 - regression_loss: 1.2245 - classification_loss: 0.1836 481/500 [===========================>..] - ETA: 6s - loss: 1.4083 - regression_loss: 1.2247 - classification_loss: 0.1836 482/500 [===========================>..] - ETA: 6s - loss: 1.4079 - regression_loss: 1.2242 - classification_loss: 0.1837 483/500 [===========================>..] - ETA: 5s - loss: 1.4073 - regression_loss: 1.2238 - classification_loss: 0.1836 484/500 [============================>.] - ETA: 5s - loss: 1.4071 - regression_loss: 1.2237 - classification_loss: 0.1835 485/500 [============================>.] - ETA: 5s - loss: 1.4063 - regression_loss: 1.2229 - classification_loss: 0.1834 486/500 [============================>.] - ETA: 4s - loss: 1.4051 - regression_loss: 1.2220 - classification_loss: 0.1831 487/500 [============================>.] - ETA: 4s - loss: 1.4034 - regression_loss: 1.2206 - classification_loss: 0.1829 488/500 [============================>.] - ETA: 4s - loss: 1.4041 - regression_loss: 1.2212 - classification_loss: 0.1830 489/500 [============================>.] 
- ETA: 3s - loss: 1.4044 - regression_loss: 1.2214 - classification_loss: 0.1830 490/500 [============================>.] - ETA: 3s - loss: 1.4039 - regression_loss: 1.2211 - classification_loss: 0.1828 491/500 [============================>.] - ETA: 3s - loss: 1.4064 - regression_loss: 1.2228 - classification_loss: 0.1835 492/500 [============================>.] - ETA: 2s - loss: 1.4059 - regression_loss: 1.2225 - classification_loss: 0.1835 493/500 [============================>.] - ETA: 2s - loss: 1.4051 - regression_loss: 1.2218 - classification_loss: 0.1833 494/500 [============================>.] - ETA: 2s - loss: 1.4054 - regression_loss: 1.2221 - classification_loss: 0.1833 495/500 [============================>.] - ETA: 1s - loss: 1.4051 - regression_loss: 1.2218 - classification_loss: 0.1833 496/500 [============================>.] - ETA: 1s - loss: 1.4049 - regression_loss: 1.2217 - classification_loss: 0.1831 497/500 [============================>.] - ETA: 1s - loss: 1.4054 - regression_loss: 1.2223 - classification_loss: 0.1831 498/500 [============================>.] - ETA: 0s - loss: 1.4054 - regression_loss: 1.2225 - classification_loss: 0.1830 499/500 [============================>.] - ETA: 0s - loss: 1.4050 - regression_loss: 1.2220 - classification_loss: 0.1830 500/500 [==============================] - 170s 340ms/step - loss: 1.4036 - regression_loss: 1.2208 - classification_loss: 0.1828
1172 instances of class plum with average precision: 0.6869
mAP: 0.6869
Epoch 00014: saving model to ./training/snapshots/resnet101_pascal_14.h5
Epoch 15/150
1/500 [..............................] - ETA: 2:49 - loss: 1.2498 - regression_loss: 1.0885 - classification_loss: 0.1613 2/500 [..............................] - ETA: 2:49 - loss: 1.4780 - regression_loss: 1.2832 - classification_loss: 0.1948 3/500 [..............................] - ETA: 2:49 - loss: 1.4161 - regression_loss: 1.2170 - classification_loss: 0.1991 4/500 [..............................]
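The end-of-epoch evaluation above reports a single class ("plum"), so the mAP printed equals that class's average precision: mAP is just the mean of the per-class AP values. Likewise, each step's `loss` is the sum of `regression_loss` and `classification_loss` (e.g. 1.2208 + 0.1828 = 1.4036 at step 500/500). A minimal sketch of both relationships, with illustrative class names and values (not taken from any keras-retinanet API):

```python
def mean_average_precision(per_class_ap):
    """mAP is the arithmetic mean of per-class average precisions."""
    return sum(per_class_ap.values()) / len(per_class_ap)

def total_loss(regression_loss, classification_loss):
    """RetinaNet's reported loss is the sum of its two components."""
    return regression_loss + classification_loss

# With one class, mAP equals that class's AP, matching the log above.
print(mean_average_precision({"plum": 0.6869}))   # 0.6869
print(total_loss(1.2208, 0.1828))                 # 1.4036
```

This is why, in a single-class dataset like this one, the "average precision" and "mAP" lines always show the same number.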
- ETA: 2:50 - loss: 1.4647 - regression_loss: 1.2629 - classification_loss: 0.2018 5/500 [..............................] - ETA: 2:47 - loss: 1.4872 - regression_loss: 1.2897 - classification_loss: 0.1975 6/500 [..............................] - ETA: 2:47 - loss: 1.4572 - regression_loss: 1.2628 - classification_loss: 0.1944 7/500 [..............................] - ETA: 2:47 - loss: 1.4615 - regression_loss: 1.2695 - classification_loss: 0.1920 8/500 [..............................] - ETA: 2:47 - loss: 1.4124 - regression_loss: 1.2294 - classification_loss: 0.1831 9/500 [..............................] - ETA: 2:47 - loss: 1.3161 - regression_loss: 1.1447 - classification_loss: 0.1714 10/500 [..............................] - ETA: 2:46 - loss: 1.3242 - regression_loss: 1.1495 - classification_loss: 0.1747 11/500 [..............................] - ETA: 2:46 - loss: 1.3359 - regression_loss: 1.1676 - classification_loss: 0.1683 12/500 [..............................] - ETA: 2:46 - loss: 1.3072 - regression_loss: 1.1417 - classification_loss: 0.1655 13/500 [..............................] - ETA: 2:47 - loss: 1.3300 - regression_loss: 1.1623 - classification_loss: 0.1677 14/500 [..............................] - ETA: 2:46 - loss: 1.3130 - regression_loss: 1.1483 - classification_loss: 0.1648 15/500 [..............................] - ETA: 2:46 - loss: 1.3180 - regression_loss: 1.1529 - classification_loss: 0.1651 16/500 [..............................] - ETA: 2:45 - loss: 1.2976 - regression_loss: 1.1321 - classification_loss: 0.1655 17/500 [>.............................] - ETA: 2:44 - loss: 1.3031 - regression_loss: 1.1352 - classification_loss: 0.1678 18/500 [>.............................] - ETA: 2:44 - loss: 1.2958 - regression_loss: 1.1320 - classification_loss: 0.1637 19/500 [>.............................] - ETA: 2:44 - loss: 1.3047 - regression_loss: 1.1406 - classification_loss: 0.1641 20/500 [>.............................] 
- ETA: 2:44 - loss: 1.3344 - regression_loss: 1.1661 - classification_loss: 0.1683 21/500 [>.............................] - ETA: 2:44 - loss: 1.3602 - regression_loss: 1.1885 - classification_loss: 0.1718 22/500 [>.............................] - ETA: 2:43 - loss: 1.3653 - regression_loss: 1.1937 - classification_loss: 0.1717 23/500 [>.............................] - ETA: 2:43 - loss: 1.3822 - regression_loss: 1.2086 - classification_loss: 0.1736 24/500 [>.............................] - ETA: 2:43 - loss: 1.3799 - regression_loss: 1.2070 - classification_loss: 0.1729 25/500 [>.............................] - ETA: 2:42 - loss: 1.3914 - regression_loss: 1.2171 - classification_loss: 0.1744 26/500 [>.............................] - ETA: 2:42 - loss: 1.3690 - regression_loss: 1.1977 - classification_loss: 0.1714 27/500 [>.............................] - ETA: 2:42 - loss: 1.3820 - regression_loss: 1.2079 - classification_loss: 0.1741 28/500 [>.............................] - ETA: 2:41 - loss: 1.4046 - regression_loss: 1.2274 - classification_loss: 0.1772 29/500 [>.............................] - ETA: 2:41 - loss: 1.4024 - regression_loss: 1.2246 - classification_loss: 0.1779 30/500 [>.............................] - ETA: 2:41 - loss: 1.4246 - regression_loss: 1.2431 - classification_loss: 0.1815 31/500 [>.............................] - ETA: 2:40 - loss: 1.4380 - regression_loss: 1.2546 - classification_loss: 0.1834 32/500 [>.............................] - ETA: 2:40 - loss: 1.4236 - regression_loss: 1.2428 - classification_loss: 0.1808 33/500 [>.............................] - ETA: 2:39 - loss: 1.4295 - regression_loss: 1.2479 - classification_loss: 0.1816 34/500 [=>............................] - ETA: 2:39 - loss: 1.4279 - regression_loss: 1.2470 - classification_loss: 0.1809 35/500 [=>............................] - ETA: 2:38 - loss: 1.4003 - regression_loss: 1.2231 - classification_loss: 0.1771 36/500 [=>............................] 
- ETA: 2:38 - loss: 1.3755 - regression_loss: 1.2016 - classification_loss: 0.1739 37/500 [=>............................] - ETA: 2:37 - loss: 1.3714 - regression_loss: 1.1973 - classification_loss: 0.1741 38/500 [=>............................] - ETA: 2:37 - loss: 1.3734 - regression_loss: 1.2007 - classification_loss: 0.1728 39/500 [=>............................] - ETA: 2:37 - loss: 1.3804 - regression_loss: 1.2084 - classification_loss: 0.1720 40/500 [=>............................] - ETA: 2:36 - loss: 1.3603 - regression_loss: 1.1899 - classification_loss: 0.1704 41/500 [=>............................] - ETA: 2:35 - loss: 1.3625 - regression_loss: 1.1907 - classification_loss: 0.1718 42/500 [=>............................] - ETA: 2:35 - loss: 1.3736 - regression_loss: 1.2000 - classification_loss: 0.1735 43/500 [=>............................] - ETA: 2:34 - loss: 1.3668 - regression_loss: 1.1937 - classification_loss: 0.1731 44/500 [=>............................] - ETA: 2:34 - loss: 1.3737 - regression_loss: 1.1996 - classification_loss: 0.1741 45/500 [=>............................] - ETA: 2:34 - loss: 1.3822 - regression_loss: 1.2053 - classification_loss: 0.1769 46/500 [=>............................] - ETA: 2:33 - loss: 1.3945 - regression_loss: 1.2156 - classification_loss: 0.1789 47/500 [=>............................] - ETA: 2:33 - loss: 1.3993 - regression_loss: 1.2199 - classification_loss: 0.1794 48/500 [=>............................] - ETA: 2:33 - loss: 1.3851 - regression_loss: 1.2065 - classification_loss: 0.1786 49/500 [=>............................] - ETA: 2:32 - loss: 1.3897 - regression_loss: 1.2112 - classification_loss: 0.1785 50/500 [==>...........................] - ETA: 2:32 - loss: 1.3741 - regression_loss: 1.1973 - classification_loss: 0.1768 51/500 [==>...........................] - ETA: 2:32 - loss: 1.3741 - regression_loss: 1.1967 - classification_loss: 0.1773 52/500 [==>...........................] 
- ETA: 2:32 - loss: 1.3665 - regression_loss: 1.1911 - classification_loss: 0.1754 53/500 [==>...........................] - ETA: 2:31 - loss: 1.3593 - regression_loss: 1.1861 - classification_loss: 0.1733 54/500 [==>...........................] - ETA: 2:31 - loss: 1.3661 - regression_loss: 1.1918 - classification_loss: 0.1743 55/500 [==>...........................] - ETA: 2:31 - loss: 1.3548 - regression_loss: 1.1821 - classification_loss: 0.1727 56/500 [==>...........................] - ETA: 2:30 - loss: 1.3535 - regression_loss: 1.1806 - classification_loss: 0.1728 57/500 [==>...........................] - ETA: 2:30 - loss: 1.3522 - regression_loss: 1.1792 - classification_loss: 0.1730 58/500 [==>...........................] - ETA: 2:30 - loss: 1.3609 - regression_loss: 1.1868 - classification_loss: 0.1741 59/500 [==>...........................] - ETA: 2:29 - loss: 1.3550 - regression_loss: 1.1816 - classification_loss: 0.1734 60/500 [==>...........................] - ETA: 2:29 - loss: 1.3526 - regression_loss: 1.1801 - classification_loss: 0.1725 61/500 [==>...........................] - ETA: 2:29 - loss: 1.3486 - regression_loss: 1.1769 - classification_loss: 0.1717 62/500 [==>...........................] - ETA: 2:28 - loss: 1.3421 - regression_loss: 1.1708 - classification_loss: 0.1714 63/500 [==>...........................] - ETA: 2:28 - loss: 1.3465 - regression_loss: 1.1740 - classification_loss: 0.1725 64/500 [==>...........................] - ETA: 2:28 - loss: 1.3505 - regression_loss: 1.1774 - classification_loss: 0.1730 65/500 [==>...........................] - ETA: 2:27 - loss: 1.3444 - regression_loss: 1.1726 - classification_loss: 0.1719 66/500 [==>...........................] - ETA: 2:27 - loss: 1.3475 - regression_loss: 1.1752 - classification_loss: 0.1723 67/500 [===>..........................] - ETA: 2:27 - loss: 1.3390 - regression_loss: 1.1685 - classification_loss: 0.1705 68/500 [===>..........................] 
- ETA: 2:26 - loss: 1.3344 - regression_loss: 1.1647 - classification_loss: 0.1697 69/500 [===>..........................] - ETA: 2:26 - loss: 1.3277 - regression_loss: 1.1590 - classification_loss: 0.1686 70/500 [===>..........................] - ETA: 2:26 - loss: 1.3297 - regression_loss: 1.1608 - classification_loss: 0.1689 71/500 [===>..........................] - ETA: 2:25 - loss: 1.3305 - regression_loss: 1.1605 - classification_loss: 0.1700 72/500 [===>..........................] - ETA: 2:25 - loss: 1.3360 - regression_loss: 1.1654 - classification_loss: 0.1706 73/500 [===>..........................] - ETA: 2:25 - loss: 1.3385 - regression_loss: 1.1670 - classification_loss: 0.1715 74/500 [===>..........................] - ETA: 2:24 - loss: 1.3379 - regression_loss: 1.1666 - classification_loss: 0.1713 75/500 [===>..........................] - ETA: 2:24 - loss: 1.3412 - regression_loss: 1.1697 - classification_loss: 0.1715 76/500 [===>..........................] - ETA: 2:24 - loss: 1.3462 - regression_loss: 1.1733 - classification_loss: 0.1730 77/500 [===>..........................] - ETA: 2:24 - loss: 1.3544 - regression_loss: 1.1804 - classification_loss: 0.1740 78/500 [===>..........................] - ETA: 2:23 - loss: 1.3628 - regression_loss: 1.1870 - classification_loss: 0.1757 79/500 [===>..........................] - ETA: 2:23 - loss: 1.3628 - regression_loss: 1.1871 - classification_loss: 0.1757 80/500 [===>..........................] - ETA: 2:23 - loss: 1.3595 - regression_loss: 1.1837 - classification_loss: 0.1757 81/500 [===>..........................] - ETA: 2:22 - loss: 1.3571 - regression_loss: 1.1824 - classification_loss: 0.1746 82/500 [===>..........................] - ETA: 2:22 - loss: 1.3556 - regression_loss: 1.1817 - classification_loss: 0.1739 83/500 [===>..........................] - ETA: 2:22 - loss: 1.3581 - regression_loss: 1.1844 - classification_loss: 0.1738 84/500 [====>.........................] 
- ETA: 2:21 - loss: 1.3467 - regression_loss: 1.1744 - classification_loss: 0.1723 85/500 [====>.........................] - ETA: 2:21 - loss: 1.3500 - regression_loss: 1.1768 - classification_loss: 0.1732 86/500 [====>.........................] - ETA: 2:21 - loss: 1.3412 - regression_loss: 1.1692 - classification_loss: 0.1719 87/500 [====>.........................] - ETA: 2:20 - loss: 1.3412 - regression_loss: 1.1707 - classification_loss: 0.1705 88/500 [====>.........................] - ETA: 2:20 - loss: 1.3404 - regression_loss: 1.1702 - classification_loss: 0.1702 89/500 [====>.........................] - ETA: 2:19 - loss: 1.3340 - regression_loss: 1.1648 - classification_loss: 0.1692 90/500 [====>.........................] - ETA: 2:19 - loss: 1.3337 - regression_loss: 1.1650 - classification_loss: 0.1688 91/500 [====>.........................] - ETA: 2:19 - loss: 1.3384 - regression_loss: 1.1692 - classification_loss: 0.1692 92/500 [====>.........................] - ETA: 2:18 - loss: 1.3330 - regression_loss: 1.1640 - classification_loss: 0.1690 93/500 [====>.........................] - ETA: 2:18 - loss: 1.3356 - regression_loss: 1.1666 - classification_loss: 0.1690 94/500 [====>.........................] - ETA: 2:18 - loss: 1.3359 - regression_loss: 1.1663 - classification_loss: 0.1697 95/500 [====>.........................] - ETA: 2:17 - loss: 1.3366 - regression_loss: 1.1672 - classification_loss: 0.1694 96/500 [====>.........................] - ETA: 2:17 - loss: 1.3352 - regression_loss: 1.1660 - classification_loss: 0.1692 97/500 [====>.........................] - ETA: 2:17 - loss: 1.3309 - regression_loss: 1.1625 - classification_loss: 0.1684 98/500 [====>.........................] - ETA: 2:16 - loss: 1.3325 - regression_loss: 1.1638 - classification_loss: 0.1687 99/500 [====>.........................] - ETA: 2:16 - loss: 1.3340 - regression_loss: 1.1650 - classification_loss: 0.1689 100/500 [=====>........................] 
- ETA: 2:16 - loss: 1.3400 - regression_loss: 1.1700 - classification_loss: 0.1700 101/500 [=====>........................] - ETA: 2:16 - loss: 1.3350 - regression_loss: 1.1658 - classification_loss: 0.1691 102/500 [=====>........................] - ETA: 2:15 - loss: 1.3364 - regression_loss: 1.1669 - classification_loss: 0.1694 103/500 [=====>........................] - ETA: 2:15 - loss: 1.3365 - regression_loss: 1.1673 - classification_loss: 0.1692 104/500 [=====>........................] - ETA: 2:15 - loss: 1.3290 - regression_loss: 1.1605 - classification_loss: 0.1685 105/500 [=====>........................] - ETA: 2:14 - loss: 1.3293 - regression_loss: 1.1610 - classification_loss: 0.1683 106/500 [=====>........................] - ETA: 2:14 - loss: 1.3300 - regression_loss: 1.1607 - classification_loss: 0.1693 107/500 [=====>........................] - ETA: 2:14 - loss: 1.3285 - regression_loss: 1.1592 - classification_loss: 0.1692 108/500 [=====>........................] - ETA: 2:13 - loss: 1.3271 - regression_loss: 1.1584 - classification_loss: 0.1687 109/500 [=====>........................] - ETA: 2:13 - loss: 1.3256 - regression_loss: 1.1572 - classification_loss: 0.1683 110/500 [=====>........................] - ETA: 2:13 - loss: 1.3299 - regression_loss: 1.1607 - classification_loss: 0.1692 111/500 [=====>........................] - ETA: 2:12 - loss: 1.3280 - regression_loss: 1.1590 - classification_loss: 0.1689 112/500 [=====>........................] - ETA: 2:12 - loss: 1.3264 - regression_loss: 1.1578 - classification_loss: 0.1686 113/500 [=====>........................] - ETA: 2:12 - loss: 1.3276 - regression_loss: 1.1582 - classification_loss: 0.1693 114/500 [=====>........................] - ETA: 2:11 - loss: 1.3292 - regression_loss: 1.1594 - classification_loss: 0.1698 115/500 [=====>........................] - ETA: 2:11 - loss: 1.3300 - regression_loss: 1.1600 - classification_loss: 0.1700 116/500 [=====>........................] 
- ETA: 2:11 - loss: 1.3286 - regression_loss: 1.1590 - classification_loss: 0.1696 117/500 [======>.......................] - ETA: 2:10 - loss: 1.3309 - regression_loss: 1.1607 - classification_loss: 0.1702 118/500 [======>.......................] - ETA: 2:10 - loss: 1.3337 - regression_loss: 1.1632 - classification_loss: 0.1706 119/500 [======>.......................] - ETA: 2:10 - loss: 1.3392 - regression_loss: 1.1676 - classification_loss: 0.1715 120/500 [======>.......................] - ETA: 2:09 - loss: 1.3383 - regression_loss: 1.1667 - classification_loss: 0.1716 121/500 [======>.......................] - ETA: 2:09 - loss: 1.3379 - regression_loss: 1.1667 - classification_loss: 0.1712 122/500 [======>.......................] - ETA: 2:09 - loss: 1.3395 - regression_loss: 1.1683 - classification_loss: 0.1711 123/500 [======>.......................] - ETA: 2:08 - loss: 1.3418 - regression_loss: 1.1705 - classification_loss: 0.1713 124/500 [======>.......................] - ETA: 2:08 - loss: 1.3422 - regression_loss: 1.1709 - classification_loss: 0.1713 125/500 [======>.......................] - ETA: 2:08 - loss: 1.3430 - regression_loss: 1.1716 - classification_loss: 0.1714 126/500 [======>.......................] - ETA: 2:07 - loss: 1.3469 - regression_loss: 1.1749 - classification_loss: 0.1721 127/500 [======>.......................] - ETA: 2:07 - loss: 1.3434 - regression_loss: 1.1717 - classification_loss: 0.1717 128/500 [======>.......................] - ETA: 2:06 - loss: 1.3390 - regression_loss: 1.1682 - classification_loss: 0.1708 129/500 [======>.......................] - ETA: 2:06 - loss: 1.3404 - regression_loss: 1.1695 - classification_loss: 0.1709 130/500 [======>.......................] - ETA: 2:06 - loss: 1.3425 - regression_loss: 1.1708 - classification_loss: 0.1717 131/500 [======>.......................] - ETA: 2:05 - loss: 1.3392 - regression_loss: 1.1681 - classification_loss: 0.1711 132/500 [======>.......................] 
- ETA: 2:05 - loss: 1.3417 - regression_loss: 1.1703 - classification_loss: 0.1714 133/500 [======>.......................] - ETA: 2:05 - loss: 1.3451 - regression_loss: 1.1735 - classification_loss: 0.1716 134/500 [=======>......................] - ETA: 2:04 - loss: 1.3434 - regression_loss: 1.1721 - classification_loss: 0.1713 135/500 [=======>......................] - ETA: 2:04 - loss: 1.3465 - regression_loss: 1.1746 - classification_loss: 0.1719 136/500 [=======>......................] - ETA: 2:04 - loss: 1.3516 - regression_loss: 1.1787 - classification_loss: 0.1730 137/500 [=======>......................] - ETA: 2:03 - loss: 1.3520 - regression_loss: 1.1789 - classification_loss: 0.1731 138/500 [=======>......................] - ETA: 2:03 - loss: 1.3522 - regression_loss: 1.1785 - classification_loss: 0.1737 139/500 [=======>......................] - ETA: 2:03 - loss: 1.3516 - regression_loss: 1.1780 - classification_loss: 0.1736 140/500 [=======>......................] - ETA: 2:02 - loss: 1.3515 - regression_loss: 1.1778 - classification_loss: 0.1736 141/500 [=======>......................] - ETA: 2:02 - loss: 1.3520 - regression_loss: 1.1787 - classification_loss: 0.1733 142/500 [=======>......................] - ETA: 2:02 - loss: 1.3571 - regression_loss: 1.1831 - classification_loss: 0.1740 143/500 [=======>......................] - ETA: 2:01 - loss: 1.3580 - regression_loss: 1.1840 - classification_loss: 0.1741 144/500 [=======>......................] - ETA: 2:01 - loss: 1.3601 - regression_loss: 1.1861 - classification_loss: 0.1740 145/500 [=======>......................] - ETA: 2:01 - loss: 1.3635 - regression_loss: 1.1893 - classification_loss: 0.1742 146/500 [=======>......................] - ETA: 2:00 - loss: 1.3634 - regression_loss: 1.1886 - classification_loss: 0.1748 147/500 [=======>......................] - ETA: 2:00 - loss: 1.3639 - regression_loss: 1.1893 - classification_loss: 0.1747 148/500 [=======>......................] 
- ETA: 2:00 - loss: 1.3640 - regression_loss: 1.1896 - classification_loss: 0.1744 149/500 [=======>......................] - ETA: 1:59 - loss: 1.3636 - regression_loss: 1.1892 - classification_loss: 0.1744 150/500 [========>.....................] - ETA: 1:59 - loss: 1.3595 - regression_loss: 1.1858 - classification_loss: 0.1737 151/500 [========>.....................] - ETA: 1:59 - loss: 1.3611 - regression_loss: 1.1871 - classification_loss: 0.1740 152/500 [========>.....................] - ETA: 1:58 - loss: 1.3603 - regression_loss: 1.1867 - classification_loss: 0.1736 153/500 [========>.....................] - ETA: 1:58 - loss: 1.3602 - regression_loss: 1.1865 - classification_loss: 0.1737 154/500 [========>.....................] - ETA: 1:58 - loss: 1.3621 - regression_loss: 1.1882 - classification_loss: 0.1739 155/500 [========>.....................] - ETA: 1:57 - loss: 1.3615 - regression_loss: 1.1875 - classification_loss: 0.1740 156/500 [========>.....................] - ETA: 1:57 - loss: 1.3653 - regression_loss: 1.1905 - classification_loss: 0.1749 157/500 [========>.....................] - ETA: 1:57 - loss: 1.3611 - regression_loss: 1.1870 - classification_loss: 0.1742 158/500 [========>.....................] - ETA: 1:56 - loss: 1.3647 - regression_loss: 1.1898 - classification_loss: 0.1749 159/500 [========>.....................] - ETA: 1:56 - loss: 1.3667 - regression_loss: 1.1915 - classification_loss: 0.1752 160/500 [========>.....................] - ETA: 1:56 - loss: 1.3608 - regression_loss: 1.1863 - classification_loss: 0.1745 161/500 [========>.....................] - ETA: 1:55 - loss: 1.3614 - regression_loss: 1.1869 - classification_loss: 0.1745 162/500 [========>.....................] - ETA: 1:55 - loss: 1.3650 - regression_loss: 1.1898 - classification_loss: 0.1752 163/500 [========>.....................] - ETA: 1:55 - loss: 1.3658 - regression_loss: 1.1905 - classification_loss: 0.1753 164/500 [========>.....................] 
- ETA: 1:54 - loss: 1.3671 - regression_loss: 1.1915 - classification_loss: 0.1755 165/500 [========>.....................] - ETA: 1:54 - loss: 1.3671 - regression_loss: 1.1914 - classification_loss: 0.1757 166/500 [========>.....................] - ETA: 1:54 - loss: 1.3679 - regression_loss: 1.1921 - classification_loss: 0.1758 167/500 [=========>....................] - ETA: 1:53 - loss: 1.3673 - regression_loss: 1.1917 - classification_loss: 0.1756 168/500 [=========>....................] - ETA: 1:53 - loss: 1.3706 - regression_loss: 1.1949 - classification_loss: 0.1757 169/500 [=========>....................] - ETA: 1:53 - loss: 1.3730 - regression_loss: 1.1974 - classification_loss: 0.1756 170/500 [=========>....................] - ETA: 1:52 - loss: 1.3754 - regression_loss: 1.1999 - classification_loss: 0.1755 171/500 [=========>....................] - ETA: 1:52 - loss: 1.3758 - regression_loss: 1.2003 - classification_loss: 0.1754 172/500 [=========>....................] - ETA: 1:52 - loss: 1.3728 - regression_loss: 1.1979 - classification_loss: 0.1750 173/500 [=========>....................] - ETA: 1:51 - loss: 1.3731 - regression_loss: 1.1978 - classification_loss: 0.1753 174/500 [=========>....................] - ETA: 1:51 - loss: 1.3773 - regression_loss: 1.2013 - classification_loss: 0.1760 175/500 [=========>....................] - ETA: 1:51 - loss: 1.3748 - regression_loss: 1.1994 - classification_loss: 0.1754 176/500 [=========>....................] - ETA: 1:50 - loss: 1.3754 - regression_loss: 1.2001 - classification_loss: 0.1753 177/500 [=========>....................] - ETA: 1:50 - loss: 1.3713 - regression_loss: 1.1967 - classification_loss: 0.1746 178/500 [=========>....................] - ETA: 1:50 - loss: 1.3681 - regression_loss: 1.1940 - classification_loss: 0.1741 179/500 [=========>....................] - ETA: 1:49 - loss: 1.3685 - regression_loss: 1.1943 - classification_loss: 0.1742 180/500 [=========>....................] 
- ETA: 1:49 - loss: 1.3666 - regression_loss: 1.1920 - classification_loss: 0.1746 181/500 [=========>....................] - ETA: 1:48 - loss: 1.3671 - regression_loss: 1.1923 - classification_loss: 0.1748 182/500 [=========>....................] - ETA: 1:48 - loss: 1.3679 - regression_loss: 1.1930 - classification_loss: 0.1749 183/500 [=========>....................] - ETA: 1:48 - loss: 1.3663 - regression_loss: 1.1916 - classification_loss: 0.1747 184/500 [==========>...................] - ETA: 1:47 - loss: 1.3648 - regression_loss: 1.1904 - classification_loss: 0.1744 185/500 [==========>...................] - ETA: 1:47 - loss: 1.3658 - regression_loss: 1.1910 - classification_loss: 0.1748 186/500 [==========>...................] - ETA: 1:47 - loss: 1.3605 - regression_loss: 1.1863 - classification_loss: 0.1742 187/500 [==========>...................] - ETA: 1:46 - loss: 1.3631 - regression_loss: 1.1890 - classification_loss: 0.1742 188/500 [==========>...................] - ETA: 1:46 - loss: 1.3619 - regression_loss: 1.1881 - classification_loss: 0.1738 189/500 [==========>...................] - ETA: 1:46 - loss: 1.3619 - regression_loss: 1.1882 - classification_loss: 0.1737 190/500 [==========>...................] - ETA: 1:45 - loss: 1.3585 - regression_loss: 1.1854 - classification_loss: 0.1731 191/500 [==========>...................] - ETA: 1:45 - loss: 1.3583 - regression_loss: 1.1853 - classification_loss: 0.1729 192/500 [==========>...................] - ETA: 1:45 - loss: 1.3602 - regression_loss: 1.1869 - classification_loss: 0.1733 193/500 [==========>...................] - ETA: 1:44 - loss: 1.3609 - regression_loss: 1.1874 - classification_loss: 0.1735 194/500 [==========>...................] - ETA: 1:44 - loss: 1.3610 - regression_loss: 1.1877 - classification_loss: 0.1733 195/500 [==========>...................] - ETA: 1:44 - loss: 1.3619 - regression_loss: 1.1884 - classification_loss: 0.1735 196/500 [==========>...................] 
- ETA: 1:43 - loss: 1.3574 - regression_loss: 1.1844 - classification_loss: 0.1730 197/500 [==========>...................] - ETA: 1:43 - loss: 1.3585 - regression_loss: 1.1852 - classification_loss: 0.1733 198/500 [==========>...................] - ETA: 1:43 - loss: 1.3608 - regression_loss: 1.1868 - classification_loss: 0.1740 199/500 [==========>...................] - ETA: 1:42 - loss: 1.3597 - regression_loss: 1.1859 - classification_loss: 0.1738 200/500 [===========>..................] - ETA: 1:42 - loss: 1.3606 - regression_loss: 1.1865 - classification_loss: 0.1741 201/500 [===========>..................] - ETA: 1:42 - loss: 1.3622 - regression_loss: 1.1879 - classification_loss: 0.1742 202/500 [===========>..................] - ETA: 1:41 - loss: 1.3626 - regression_loss: 1.1884 - classification_loss: 0.1742 203/500 [===========>..................] - ETA: 1:41 - loss: 1.3634 - regression_loss: 1.1893 - classification_loss: 0.1742 204/500 [===========>..................] - ETA: 1:40 - loss: 1.3644 - regression_loss: 1.1901 - classification_loss: 0.1743 205/500 [===========>..................] - ETA: 1:40 - loss: 1.3643 - regression_loss: 1.1902 - classification_loss: 0.1741 206/500 [===========>..................] - ETA: 1:40 - loss: 1.3668 - regression_loss: 1.1924 - classification_loss: 0.1745 207/500 [===========>..................] - ETA: 1:39 - loss: 1.3685 - regression_loss: 1.1940 - classification_loss: 0.1745 208/500 [===========>..................] - ETA: 1:39 - loss: 1.3698 - regression_loss: 1.1948 - classification_loss: 0.1750 209/500 [===========>..................] - ETA: 1:39 - loss: 1.3713 - regression_loss: 1.1961 - classification_loss: 0.1752 210/500 [===========>..................] - ETA: 1:38 - loss: 1.3684 - regression_loss: 1.1936 - classification_loss: 0.1749 211/500 [===========>..................] - ETA: 1:38 - loss: 1.3682 - regression_loss: 1.1936 - classification_loss: 0.1746 212/500 [===========>..................] 
- ETA: 1:38 - loss: 1.3678 - regression_loss: 1.1933 - classification_loss: 0.1745 213/500 [===========>..................] - ETA: 1:37 - loss: 1.3685 - regression_loss: 1.1939 - classification_loss: 0.1746 214/500 [===========>..................] - ETA: 1:37 - loss: 1.3666 - regression_loss: 1.1923 - classification_loss: 0.1743 215/500 [===========>..................] - ETA: 1:37 - loss: 1.3650 - regression_loss: 1.1910 - classification_loss: 0.1740 216/500 [===========>..................] - ETA: 1:36 - loss: 1.3609 - regression_loss: 1.1874 - classification_loss: 0.1735 217/500 [============>.................] - ETA: 1:36 - loss: 1.3592 - regression_loss: 1.1861 - classification_loss: 0.1731 218/500 [============>.................] - ETA: 1:36 - loss: 1.3558 - regression_loss: 1.1832 - classification_loss: 0.1727 219/500 [============>.................] - ETA: 1:35 - loss: 1.3593 - regression_loss: 1.1861 - classification_loss: 0.1732 220/500 [============>.................] - ETA: 1:35 - loss: 1.3593 - regression_loss: 1.1862 - classification_loss: 0.1731 221/500 [============>.................] - ETA: 1:34 - loss: 1.3585 - regression_loss: 1.1856 - classification_loss: 0.1729 222/500 [============>.................] - ETA: 1:34 - loss: 1.3584 - regression_loss: 1.1857 - classification_loss: 0.1727 223/500 [============>.................] - ETA: 1:34 - loss: 1.3575 - regression_loss: 1.1851 - classification_loss: 0.1724 224/500 [============>.................] - ETA: 1:33 - loss: 1.3557 - regression_loss: 1.1835 - classification_loss: 0.1722 225/500 [============>.................] - ETA: 1:33 - loss: 1.3578 - regression_loss: 1.1852 - classification_loss: 0.1726 226/500 [============>.................] - ETA: 1:33 - loss: 1.3577 - regression_loss: 1.1852 - classification_loss: 0.1725 227/500 [============>.................] - ETA: 1:32 - loss: 1.3586 - regression_loss: 1.1859 - classification_loss: 0.1727 228/500 [============>.................] 
- ETA: 1:32 - loss: 1.3603 - regression_loss: 1.1872 - classification_loss: 0.1731 229/500 [============>.................] - ETA: 1:32 - loss: 1.3605 - regression_loss: 1.1874 - classification_loss: 0.1732 230/500 [============>.................] - ETA: 1:31 - loss: 1.3627 - regression_loss: 1.1891 - classification_loss: 0.1735 231/500 [============>.................] - ETA: 1:31 - loss: 1.3618 - regression_loss: 1.1883 - classification_loss: 0.1735 232/500 [============>.................] - ETA: 1:31 - loss: 1.3613 - regression_loss: 1.1877 - classification_loss: 0.1736 233/500 [============>.................] - ETA: 1:30 - loss: 1.3610 - regression_loss: 1.1874 - classification_loss: 0.1737 234/500 [=============>................] - ETA: 1:30 - loss: 1.3603 - regression_loss: 1.1868 - classification_loss: 0.1736 235/500 [=============>................] - ETA: 1:30 - loss: 1.3608 - regression_loss: 1.1872 - classification_loss: 0.1736 236/500 [=============>................] - ETA: 1:29 - loss: 1.3583 - regression_loss: 1.1851 - classification_loss: 0.1732 237/500 [=============>................] - ETA: 1:29 - loss: 1.3595 - regression_loss: 1.1862 - classification_loss: 0.1734 238/500 [=============>................] - ETA: 1:29 - loss: 1.3612 - regression_loss: 1.1876 - classification_loss: 0.1736 239/500 [=============>................] - ETA: 1:28 - loss: 1.3607 - regression_loss: 1.1871 - classification_loss: 0.1736 240/500 [=============>................] - ETA: 1:28 - loss: 1.3617 - regression_loss: 1.1877 - classification_loss: 0.1739 241/500 [=============>................] - ETA: 1:28 - loss: 1.3624 - regression_loss: 1.1884 - classification_loss: 0.1740 242/500 [=============>................] - ETA: 1:27 - loss: 1.3593 - regression_loss: 1.1858 - classification_loss: 0.1736 243/500 [=============>................] - ETA: 1:27 - loss: 1.3591 - regression_loss: 1.1856 - classification_loss: 0.1735 244/500 [=============>................] 
- ETA: 1:27 - loss: 1.3570 - regression_loss: 1.1838 - classification_loss: 0.1731 245/500 [=============>................] - ETA: 1:26 - loss: 1.3581 - regression_loss: 1.1845 - classification_loss: 0.1736 246/500 [=============>................] - ETA: 1:26 - loss: 1.3545 - regression_loss: 1.1815 - classification_loss: 0.1731 247/500 [=============>................] - ETA: 1:26 - loss: 1.3520 - regression_loss: 1.1793 - classification_loss: 0.1727 248/500 [=============>................] - ETA: 1:25 - loss: 1.3529 - regression_loss: 1.1799 - classification_loss: 0.1730 249/500 [=============>................] - ETA: 1:25 - loss: 1.3546 - regression_loss: 1.1814 - classification_loss: 0.1731 250/500 [==============>...............] - ETA: 1:25 - loss: 1.3565 - regression_loss: 1.1831 - classification_loss: 0.1734 251/500 [==============>...............] - ETA: 1:24 - loss: 1.3570 - regression_loss: 1.1835 - classification_loss: 0.1735 252/500 [==============>...............] - ETA: 1:24 - loss: 1.3564 - regression_loss: 1.1832 - classification_loss: 0.1732 253/500 [==============>...............] - ETA: 1:24 - loss: 1.3554 - regression_loss: 1.1822 - classification_loss: 0.1731 254/500 [==============>...............] - ETA: 1:23 - loss: 1.3559 - regression_loss: 1.1826 - classification_loss: 0.1733 255/500 [==============>...............] - ETA: 1:23 - loss: 1.3565 - regression_loss: 1.1831 - classification_loss: 0.1734 256/500 [==============>...............] - ETA: 1:23 - loss: 1.3563 - regression_loss: 1.1829 - classification_loss: 0.1734 257/500 [==============>...............] - ETA: 1:22 - loss: 1.3570 - regression_loss: 1.1835 - classification_loss: 0.1735 258/500 [==============>...............] - ETA: 1:22 - loss: 1.3551 - regression_loss: 1.1819 - classification_loss: 0.1732 259/500 [==============>...............] - ETA: 1:22 - loss: 1.3542 - regression_loss: 1.1811 - classification_loss: 0.1731 260/500 [==============>...............] 
500/500 [==============================] - 170s 340ms/step - loss: 1.3511 - regression_loss: 1.1780 - classification_loss: 0.1731
1172 instances of class plum with average precision: 0.7724
mAP: 0.7724
Epoch 00015: saving model to ./training/snapshots/resnet101_pascal_15.h5
Epoch 16/150
94/500 [====>.........................]
- ETA: 2:17 - loss: 1.3784 - regression_loss: 1.1978 - classification_loss: 0.1806 95/500 [====>.........................] - ETA: 2:17 - loss: 1.3822 - regression_loss: 1.2014 - classification_loss: 0.1809 96/500 [====>.........................] - ETA: 2:17 - loss: 1.3845 - regression_loss: 1.2037 - classification_loss: 0.1808 97/500 [====>.........................] - ETA: 2:16 - loss: 1.3854 - regression_loss: 1.2045 - classification_loss: 0.1808 98/500 [====>.........................] - ETA: 2:16 - loss: 1.3809 - regression_loss: 1.2007 - classification_loss: 0.1802 99/500 [====>.........................] - ETA: 2:15 - loss: 1.3725 - regression_loss: 1.1934 - classification_loss: 0.1791 100/500 [=====>........................] - ETA: 2:15 - loss: 1.3776 - regression_loss: 1.1982 - classification_loss: 0.1793 101/500 [=====>........................] - ETA: 2:15 - loss: 1.3775 - regression_loss: 1.1985 - classification_loss: 0.1791 102/500 [=====>........................] - ETA: 2:14 - loss: 1.3797 - regression_loss: 1.2002 - classification_loss: 0.1794 103/500 [=====>........................] - ETA: 2:14 - loss: 1.3844 - regression_loss: 1.2047 - classification_loss: 0.1797 104/500 [=====>........................] - ETA: 2:14 - loss: 1.3771 - regression_loss: 1.1981 - classification_loss: 0.1790 105/500 [=====>........................] - ETA: 2:13 - loss: 1.3710 - regression_loss: 1.1932 - classification_loss: 0.1778 106/500 [=====>........................] - ETA: 2:13 - loss: 1.3642 - regression_loss: 1.1872 - classification_loss: 0.1770 107/500 [=====>........................] - ETA: 2:13 - loss: 1.3643 - regression_loss: 1.1872 - classification_loss: 0.1772 108/500 [=====>........................] - ETA: 2:12 - loss: 1.3633 - regression_loss: 1.1860 - classification_loss: 0.1773 109/500 [=====>........................] - ETA: 2:12 - loss: 1.3621 - regression_loss: 1.1845 - classification_loss: 0.1776 110/500 [=====>........................] 
- ETA: 2:12 - loss: 1.3556 - regression_loss: 1.1790 - classification_loss: 0.1766 111/500 [=====>........................] - ETA: 2:11 - loss: 1.3573 - regression_loss: 1.1786 - classification_loss: 0.1787 112/500 [=====>........................] - ETA: 2:11 - loss: 1.3509 - regression_loss: 1.1732 - classification_loss: 0.1776 113/500 [=====>........................] - ETA: 2:11 - loss: 1.3537 - regression_loss: 1.1757 - classification_loss: 0.1780 114/500 [=====>........................] - ETA: 2:10 - loss: 1.3482 - regression_loss: 1.1711 - classification_loss: 0.1770 115/500 [=====>........................] - ETA: 2:10 - loss: 1.3500 - regression_loss: 1.1730 - classification_loss: 0.1770 116/500 [=====>........................] - ETA: 2:10 - loss: 1.3526 - regression_loss: 1.1760 - classification_loss: 0.1767 117/500 [======>.......................] - ETA: 2:10 - loss: 1.3481 - regression_loss: 1.1722 - classification_loss: 0.1758 118/500 [======>.......................] - ETA: 2:09 - loss: 1.3523 - regression_loss: 1.1758 - classification_loss: 0.1764 119/500 [======>.......................] - ETA: 2:09 - loss: 1.3535 - regression_loss: 1.1768 - classification_loss: 0.1767 120/500 [======>.......................] - ETA: 2:08 - loss: 1.3526 - regression_loss: 1.1759 - classification_loss: 0.1767 121/500 [======>.......................] - ETA: 2:08 - loss: 1.3560 - regression_loss: 1.1784 - classification_loss: 0.1777 122/500 [======>.......................] - ETA: 2:08 - loss: 1.3610 - regression_loss: 1.1826 - classification_loss: 0.1783 123/500 [======>.......................] - ETA: 2:07 - loss: 1.3626 - regression_loss: 1.1840 - classification_loss: 0.1786 124/500 [======>.......................] - ETA: 2:07 - loss: 1.3609 - regression_loss: 1.1827 - classification_loss: 0.1782 125/500 [======>.......................] - ETA: 2:07 - loss: 1.3561 - regression_loss: 1.1787 - classification_loss: 0.1774 126/500 [======>.......................] 
- ETA: 2:06 - loss: 1.3601 - regression_loss: 1.1820 - classification_loss: 0.1781 127/500 [======>.......................] - ETA: 2:06 - loss: 1.3594 - regression_loss: 1.1814 - classification_loss: 0.1780 128/500 [======>.......................] - ETA: 2:06 - loss: 1.3571 - regression_loss: 1.1790 - classification_loss: 0.1781 129/500 [======>.......................] - ETA: 2:05 - loss: 1.3647 - regression_loss: 1.1851 - classification_loss: 0.1795 130/500 [======>.......................] - ETA: 2:05 - loss: 1.3577 - regression_loss: 1.1793 - classification_loss: 0.1784 131/500 [======>.......................] - ETA: 2:05 - loss: 1.3592 - regression_loss: 1.1811 - classification_loss: 0.1780 132/500 [======>.......................] - ETA: 2:04 - loss: 1.3566 - regression_loss: 1.1790 - classification_loss: 0.1776 133/500 [======>.......................] - ETA: 2:04 - loss: 1.3552 - regression_loss: 1.1782 - classification_loss: 0.1769 134/500 [=======>......................] - ETA: 2:04 - loss: 1.3513 - regression_loss: 1.1749 - classification_loss: 0.1764 135/500 [=======>......................] - ETA: 2:03 - loss: 1.3516 - regression_loss: 1.1753 - classification_loss: 0.1763 136/500 [=======>......................] - ETA: 2:03 - loss: 1.3484 - regression_loss: 1.1726 - classification_loss: 0.1758 137/500 [=======>......................] - ETA: 2:03 - loss: 1.3502 - regression_loss: 1.1740 - classification_loss: 0.1762 138/500 [=======>......................] - ETA: 2:02 - loss: 1.3500 - regression_loss: 1.1739 - classification_loss: 0.1762 139/500 [=======>......................] - ETA: 2:02 - loss: 1.3562 - regression_loss: 1.1795 - classification_loss: 0.1767 140/500 [=======>......................] - ETA: 2:02 - loss: 1.3557 - regression_loss: 1.1793 - classification_loss: 0.1764 141/500 [=======>......................] - ETA: 2:01 - loss: 1.3537 - regression_loss: 1.1775 - classification_loss: 0.1762 142/500 [=======>......................] 
- ETA: 2:01 - loss: 1.3543 - regression_loss: 1.1780 - classification_loss: 0.1762 143/500 [=======>......................] - ETA: 2:01 - loss: 1.3533 - regression_loss: 1.1771 - classification_loss: 0.1761 144/500 [=======>......................] - ETA: 2:00 - loss: 1.3531 - regression_loss: 1.1773 - classification_loss: 0.1758 145/500 [=======>......................] - ETA: 2:00 - loss: 1.3519 - regression_loss: 1.1762 - classification_loss: 0.1756 146/500 [=======>......................] - ETA: 2:00 - loss: 1.3527 - regression_loss: 1.1770 - classification_loss: 0.1757 147/500 [=======>......................] - ETA: 1:59 - loss: 1.3519 - regression_loss: 1.1763 - classification_loss: 0.1756 148/500 [=======>......................] - ETA: 1:59 - loss: 1.3526 - regression_loss: 1.1769 - classification_loss: 0.1757 149/500 [=======>......................] - ETA: 1:59 - loss: 1.3541 - regression_loss: 1.1783 - classification_loss: 0.1758 150/500 [========>.....................] - ETA: 1:58 - loss: 1.3531 - regression_loss: 1.1773 - classification_loss: 0.1758 151/500 [========>.....................] - ETA: 1:58 - loss: 1.3505 - regression_loss: 1.1753 - classification_loss: 0.1752 152/500 [========>.....................] - ETA: 1:58 - loss: 1.3522 - regression_loss: 1.1767 - classification_loss: 0.1754 153/500 [========>.....................] - ETA: 1:57 - loss: 1.3546 - regression_loss: 1.1789 - classification_loss: 0.1757 154/500 [========>.....................] - ETA: 1:57 - loss: 1.3551 - regression_loss: 1.1796 - classification_loss: 0.1756 155/500 [========>.....................] - ETA: 1:57 - loss: 1.3576 - regression_loss: 1.1815 - classification_loss: 0.1761 156/500 [========>.....................] - ETA: 1:57 - loss: 1.3606 - regression_loss: 1.1841 - classification_loss: 0.1765 157/500 [========>.....................] - ETA: 1:56 - loss: 1.3616 - regression_loss: 1.1848 - classification_loss: 0.1768 158/500 [========>.....................] 
- ETA: 1:56 - loss: 1.3637 - regression_loss: 1.1866 - classification_loss: 0.1771 159/500 [========>.....................] - ETA: 1:56 - loss: 1.3649 - regression_loss: 1.1880 - classification_loss: 0.1769 160/500 [========>.....................] - ETA: 1:55 - loss: 1.3642 - regression_loss: 1.1875 - classification_loss: 0.1767 161/500 [========>.....................] - ETA: 1:55 - loss: 1.3656 - regression_loss: 1.1887 - classification_loss: 0.1769 162/500 [========>.....................] - ETA: 1:55 - loss: 1.3655 - regression_loss: 1.1890 - classification_loss: 0.1765 163/500 [========>.....................] - ETA: 1:54 - loss: 1.3656 - regression_loss: 1.1893 - classification_loss: 0.1763 164/500 [========>.....................] - ETA: 1:54 - loss: 1.3680 - regression_loss: 1.1909 - classification_loss: 0.1771 165/500 [========>.....................] - ETA: 1:54 - loss: 1.3672 - regression_loss: 1.1902 - classification_loss: 0.1770 166/500 [========>.....................] - ETA: 1:53 - loss: 1.3691 - regression_loss: 1.1920 - classification_loss: 0.1771 167/500 [=========>....................] - ETA: 1:53 - loss: 1.3667 - regression_loss: 1.1900 - classification_loss: 0.1767 168/500 [=========>....................] - ETA: 1:53 - loss: 1.3654 - regression_loss: 1.1891 - classification_loss: 0.1763 169/500 [=========>....................] - ETA: 1:52 - loss: 1.3646 - regression_loss: 1.1886 - classification_loss: 0.1760 170/500 [=========>....................] - ETA: 1:52 - loss: 1.3627 - regression_loss: 1.1868 - classification_loss: 0.1759 171/500 [=========>....................] - ETA: 1:52 - loss: 1.3618 - regression_loss: 1.1862 - classification_loss: 0.1755 172/500 [=========>....................] - ETA: 1:51 - loss: 1.3635 - regression_loss: 1.1879 - classification_loss: 0.1757 173/500 [=========>....................] - ETA: 1:51 - loss: 1.3620 - regression_loss: 1.1869 - classification_loss: 0.1751 174/500 [=========>....................] 
- ETA: 1:51 - loss: 1.3607 - regression_loss: 1.1858 - classification_loss: 0.1749 175/500 [=========>....................] - ETA: 1:50 - loss: 1.3630 - regression_loss: 1.1874 - classification_loss: 0.1756 176/500 [=========>....................] - ETA: 1:50 - loss: 1.3612 - regression_loss: 1.1858 - classification_loss: 0.1754 177/500 [=========>....................] - ETA: 1:50 - loss: 1.3626 - regression_loss: 1.1869 - classification_loss: 0.1757 178/500 [=========>....................] - ETA: 1:49 - loss: 1.3604 - regression_loss: 1.1849 - classification_loss: 0.1755 179/500 [=========>....................] - ETA: 1:49 - loss: 1.3602 - regression_loss: 1.1842 - classification_loss: 0.1760 180/500 [=========>....................] - ETA: 1:49 - loss: 1.3654 - regression_loss: 1.1879 - classification_loss: 0.1776 181/500 [=========>....................] - ETA: 1:48 - loss: 1.3651 - regression_loss: 1.1878 - classification_loss: 0.1772 182/500 [=========>....................] - ETA: 1:48 - loss: 1.3636 - regression_loss: 1.1862 - classification_loss: 0.1774 183/500 [=========>....................] - ETA: 1:47 - loss: 1.3604 - regression_loss: 1.1833 - classification_loss: 0.1771 184/500 [==========>...................] - ETA: 1:47 - loss: 1.3599 - regression_loss: 1.1829 - classification_loss: 0.1770 185/500 [==========>...................] - ETA: 1:47 - loss: 1.3594 - regression_loss: 1.1827 - classification_loss: 0.1766 186/500 [==========>...................] - ETA: 1:46 - loss: 1.3564 - regression_loss: 1.1803 - classification_loss: 0.1761 187/500 [==========>...................] - ETA: 1:46 - loss: 1.3583 - regression_loss: 1.1817 - classification_loss: 0.1766 188/500 [==========>...................] - ETA: 1:46 - loss: 1.3552 - regression_loss: 1.1791 - classification_loss: 0.1761 189/500 [==========>...................] - ETA: 1:45 - loss: 1.3545 - regression_loss: 1.1784 - classification_loss: 0.1761 190/500 [==========>...................] 
- ETA: 1:45 - loss: 1.3560 - regression_loss: 1.1801 - classification_loss: 0.1759 191/500 [==========>...................] - ETA: 1:45 - loss: 1.3564 - regression_loss: 1.1806 - classification_loss: 0.1757 192/500 [==========>...................] - ETA: 1:44 - loss: 1.3553 - regression_loss: 1.1798 - classification_loss: 0.1755 193/500 [==========>...................] - ETA: 1:44 - loss: 1.3536 - regression_loss: 1.1782 - classification_loss: 0.1754 194/500 [==========>...................] - ETA: 1:44 - loss: 1.3538 - regression_loss: 1.1783 - classification_loss: 0.1754 195/500 [==========>...................] - ETA: 1:43 - loss: 1.3552 - regression_loss: 1.1795 - classification_loss: 0.1756 196/500 [==========>...................] - ETA: 1:43 - loss: 1.3562 - regression_loss: 1.1804 - classification_loss: 0.1759 197/500 [==========>...................] - ETA: 1:43 - loss: 1.3556 - regression_loss: 1.1800 - classification_loss: 0.1756 198/500 [==========>...................] - ETA: 1:42 - loss: 1.3563 - regression_loss: 1.1805 - classification_loss: 0.1758 199/500 [==========>...................] - ETA: 1:42 - loss: 1.3558 - regression_loss: 1.1800 - classification_loss: 0.1758 200/500 [===========>..................] - ETA: 1:42 - loss: 1.3593 - regression_loss: 1.1833 - classification_loss: 0.1759 201/500 [===========>..................] - ETA: 1:41 - loss: 1.3604 - regression_loss: 1.1844 - classification_loss: 0.1760 202/500 [===========>..................] - ETA: 1:41 - loss: 1.3600 - regression_loss: 1.1837 - classification_loss: 0.1763 203/500 [===========>..................] - ETA: 1:41 - loss: 1.3558 - regression_loss: 1.1801 - classification_loss: 0.1757 204/500 [===========>..................] - ETA: 1:40 - loss: 1.3549 - regression_loss: 1.1796 - classification_loss: 0.1754 205/500 [===========>..................] - ETA: 1:40 - loss: 1.3553 - regression_loss: 1.1791 - classification_loss: 0.1762 206/500 [===========>..................] 
- ETA: 1:39 - loss: 1.3577 - regression_loss: 1.1811 - classification_loss: 0.1766 207/500 [===========>..................] - ETA: 1:39 - loss: 1.3567 - regression_loss: 1.1800 - classification_loss: 0.1766 208/500 [===========>..................] - ETA: 1:39 - loss: 1.3540 - regression_loss: 1.1778 - classification_loss: 0.1762 209/500 [===========>..................] - ETA: 1:38 - loss: 1.3516 - regression_loss: 1.1757 - classification_loss: 0.1759 210/500 [===========>..................] - ETA: 1:38 - loss: 1.3539 - regression_loss: 1.1776 - classification_loss: 0.1763 211/500 [===========>..................] - ETA: 1:38 - loss: 1.3545 - regression_loss: 1.1782 - classification_loss: 0.1763 212/500 [===========>..................] - ETA: 1:37 - loss: 1.3545 - regression_loss: 1.1783 - classification_loss: 0.1762 213/500 [===========>..................] - ETA: 1:37 - loss: 1.3520 - regression_loss: 1.1764 - classification_loss: 0.1756 214/500 [===========>..................] - ETA: 1:37 - loss: 1.3524 - regression_loss: 1.1767 - classification_loss: 0.1757 215/500 [===========>..................] - ETA: 1:36 - loss: 1.3496 - regression_loss: 1.1743 - classification_loss: 0.1752 216/500 [===========>..................] - ETA: 1:36 - loss: 1.3527 - regression_loss: 1.1770 - classification_loss: 0.1758 217/500 [============>.................] - ETA: 1:36 - loss: 1.3541 - regression_loss: 1.1784 - classification_loss: 0.1757 218/500 [============>.................] - ETA: 1:35 - loss: 1.3556 - regression_loss: 1.1799 - classification_loss: 0.1757 219/500 [============>.................] - ETA: 1:35 - loss: 1.3555 - regression_loss: 1.1799 - classification_loss: 0.1757 220/500 [============>.................] - ETA: 1:35 - loss: 1.3538 - regression_loss: 1.1783 - classification_loss: 0.1755 221/500 [============>.................] - ETA: 1:34 - loss: 1.3529 - regression_loss: 1.1774 - classification_loss: 0.1754 222/500 [============>.................] 
- ETA: 1:34 - loss: 1.3533 - regression_loss: 1.1778 - classification_loss: 0.1755 223/500 [============>.................] - ETA: 1:34 - loss: 1.3545 - regression_loss: 1.1788 - classification_loss: 0.1758 224/500 [============>.................] - ETA: 1:33 - loss: 1.3551 - regression_loss: 1.1795 - classification_loss: 0.1757 225/500 [============>.................] - ETA: 1:33 - loss: 1.3570 - regression_loss: 1.1810 - classification_loss: 0.1759 226/500 [============>.................] - ETA: 1:33 - loss: 1.3568 - regression_loss: 1.1810 - classification_loss: 0.1758 227/500 [============>.................] - ETA: 1:32 - loss: 1.3575 - regression_loss: 1.1817 - classification_loss: 0.1759 228/500 [============>.................] - ETA: 1:32 - loss: 1.3566 - regression_loss: 1.1810 - classification_loss: 0.1757 229/500 [============>.................] - ETA: 1:31 - loss: 1.3576 - regression_loss: 1.1817 - classification_loss: 0.1759 230/500 [============>.................] - ETA: 1:31 - loss: 1.3581 - regression_loss: 1.1821 - classification_loss: 0.1760 231/500 [============>.................] - ETA: 1:31 - loss: 1.3581 - regression_loss: 1.1822 - classification_loss: 0.1759 232/500 [============>.................] - ETA: 1:30 - loss: 1.3569 - regression_loss: 1.1814 - classification_loss: 0.1755 233/500 [============>.................] - ETA: 1:30 - loss: 1.3563 - regression_loss: 1.1808 - classification_loss: 0.1755 234/500 [=============>................] - ETA: 1:30 - loss: 1.3538 - regression_loss: 1.1787 - classification_loss: 0.1751 235/500 [=============>................] - ETA: 1:29 - loss: 1.3547 - regression_loss: 1.1796 - classification_loss: 0.1750 236/500 [=============>................] - ETA: 1:29 - loss: 1.3547 - regression_loss: 1.1797 - classification_loss: 0.1750 237/500 [=============>................] - ETA: 1:29 - loss: 1.3562 - regression_loss: 1.1814 - classification_loss: 0.1748 238/500 [=============>................] 
- ETA: 1:28 - loss: 1.3548 - regression_loss: 1.1799 - classification_loss: 0.1749 239/500 [=============>................] - ETA: 1:28 - loss: 1.3549 - regression_loss: 1.1800 - classification_loss: 0.1750 240/500 [=============>................] - ETA: 1:28 - loss: 1.3548 - regression_loss: 1.1799 - classification_loss: 0.1749 241/500 [=============>................] - ETA: 1:27 - loss: 1.3541 - regression_loss: 1.1793 - classification_loss: 0.1749 242/500 [=============>................] - ETA: 1:27 - loss: 1.3540 - regression_loss: 1.1792 - classification_loss: 0.1748 243/500 [=============>................] - ETA: 1:27 - loss: 1.3550 - regression_loss: 1.1800 - classification_loss: 0.1750 244/500 [=============>................] - ETA: 1:26 - loss: 1.3552 - regression_loss: 1.1802 - classification_loss: 0.1750 245/500 [=============>................] - ETA: 1:26 - loss: 1.3537 - regression_loss: 1.1790 - classification_loss: 0.1747 246/500 [=============>................] - ETA: 1:26 - loss: 1.3520 - regression_loss: 1.1777 - classification_loss: 0.1743 247/500 [=============>................] - ETA: 1:25 - loss: 1.3488 - regression_loss: 1.1750 - classification_loss: 0.1738 248/500 [=============>................] - ETA: 1:25 - loss: 1.3526 - regression_loss: 1.1785 - classification_loss: 0.1740 249/500 [=============>................] - ETA: 1:24 - loss: 1.3541 - regression_loss: 1.1798 - classification_loss: 0.1743 250/500 [==============>...............] - ETA: 1:24 - loss: 1.3514 - regression_loss: 1.1776 - classification_loss: 0.1738 251/500 [==============>...............] - ETA: 1:24 - loss: 1.3502 - regression_loss: 1.1767 - classification_loss: 0.1735 252/500 [==============>...............] - ETA: 1:23 - loss: 1.3512 - regression_loss: 1.1777 - classification_loss: 0.1735 253/500 [==============>...............] - ETA: 1:23 - loss: 1.3532 - regression_loss: 1.1798 - classification_loss: 0.1734 254/500 [==============>...............] 
- ETA: 1:23 - loss: 1.3546 - regression_loss: 1.1810 - classification_loss: 0.1737 255/500 [==============>...............] - ETA: 1:22 - loss: 1.3534 - regression_loss: 1.1799 - classification_loss: 0.1735 256/500 [==============>...............] - ETA: 1:22 - loss: 1.3534 - regression_loss: 1.1799 - classification_loss: 0.1735 257/500 [==============>...............] - ETA: 1:22 - loss: 1.3507 - regression_loss: 1.1777 - classification_loss: 0.1730 258/500 [==============>...............] - ETA: 1:21 - loss: 1.3499 - regression_loss: 1.1770 - classification_loss: 0.1729 259/500 [==============>...............] - ETA: 1:21 - loss: 1.3478 - regression_loss: 1.1755 - classification_loss: 0.1723 260/500 [==============>...............] - ETA: 1:21 - loss: 1.3460 - regression_loss: 1.1740 - classification_loss: 0.1720 261/500 [==============>...............] - ETA: 1:20 - loss: 1.3464 - regression_loss: 1.1743 - classification_loss: 0.1721 262/500 [==============>...............] - ETA: 1:20 - loss: 1.3472 - regression_loss: 1.1750 - classification_loss: 0.1722 263/500 [==============>...............] - ETA: 1:20 - loss: 1.3479 - regression_loss: 1.1756 - classification_loss: 0.1723 264/500 [==============>...............] - ETA: 1:19 - loss: 1.3487 - regression_loss: 1.1763 - classification_loss: 0.1724 265/500 [==============>...............] - ETA: 1:19 - loss: 1.3486 - regression_loss: 1.1759 - classification_loss: 0.1727 266/500 [==============>...............] - ETA: 1:19 - loss: 1.3461 - regression_loss: 1.1739 - classification_loss: 0.1723 267/500 [===============>..............] - ETA: 1:18 - loss: 1.3463 - regression_loss: 1.1741 - classification_loss: 0.1722 268/500 [===============>..............] - ETA: 1:18 - loss: 1.3454 - regression_loss: 1.1734 - classification_loss: 0.1720 269/500 [===============>..............] - ETA: 1:18 - loss: 1.3458 - regression_loss: 1.1736 - classification_loss: 0.1721 270/500 [===============>..............] 
- ETA: 1:17 - loss: 1.3469 - regression_loss: 1.1746 - classification_loss: 0.1724 271/500 [===============>..............] - ETA: 1:17 - loss: 1.3487 - regression_loss: 1.1761 - classification_loss: 0.1726 272/500 [===============>..............] - ETA: 1:17 - loss: 1.3498 - regression_loss: 1.1770 - classification_loss: 0.1728 273/500 [===============>..............] - ETA: 1:16 - loss: 1.3499 - regression_loss: 1.1771 - classification_loss: 0.1728 274/500 [===============>..............] - ETA: 1:16 - loss: 1.3511 - regression_loss: 1.1779 - classification_loss: 0.1732 275/500 [===============>..............] - ETA: 1:16 - loss: 1.3515 - regression_loss: 1.1782 - classification_loss: 0.1733 276/500 [===============>..............] - ETA: 1:15 - loss: 1.3518 - regression_loss: 1.1786 - classification_loss: 0.1732 277/500 [===============>..............] - ETA: 1:15 - loss: 1.3529 - regression_loss: 1.1794 - classification_loss: 0.1735 278/500 [===============>..............] - ETA: 1:15 - loss: 1.3507 - regression_loss: 1.1775 - classification_loss: 0.1733 279/500 [===============>..............] - ETA: 1:14 - loss: 1.3531 - regression_loss: 1.1798 - classification_loss: 0.1733 280/500 [===============>..............] - ETA: 1:14 - loss: 1.3533 - regression_loss: 1.1799 - classification_loss: 0.1734 281/500 [===============>..............] - ETA: 1:14 - loss: 1.3541 - regression_loss: 1.1807 - classification_loss: 0.1734 282/500 [===============>..............] - ETA: 1:13 - loss: 1.3543 - regression_loss: 1.1808 - classification_loss: 0.1735 283/500 [===============>..............] - ETA: 1:13 - loss: 1.3540 - regression_loss: 1.1807 - classification_loss: 0.1733 284/500 [================>.............] - ETA: 1:13 - loss: 1.3523 - regression_loss: 1.1792 - classification_loss: 0.1730 285/500 [================>.............] - ETA: 1:12 - loss: 1.3497 - regression_loss: 1.1771 - classification_loss: 0.1726 286/500 [================>.............] 
- ETA: 1:12 - loss: 1.3485 - regression_loss: 1.1761 - classification_loss: 0.1725 287/500 [================>.............] - ETA: 1:12 - loss: 1.3491 - regression_loss: 1.1768 - classification_loss: 0.1723 288/500 [================>.............] - ETA: 1:11 - loss: 1.3457 - regression_loss: 1.1738 - classification_loss: 0.1718 289/500 [================>.............] - ETA: 1:11 - loss: 1.3446 - regression_loss: 1.1730 - classification_loss: 0.1716 290/500 [================>.............] - ETA: 1:11 - loss: 1.3456 - regression_loss: 1.1736 - classification_loss: 0.1719 291/500 [================>.............] - ETA: 1:10 - loss: 1.3463 - regression_loss: 1.1743 - classification_loss: 0.1719 292/500 [================>.............] - ETA: 1:10 - loss: 1.3442 - regression_loss: 1.1727 - classification_loss: 0.1715 293/500 [================>.............] - ETA: 1:10 - loss: 1.3451 - regression_loss: 1.1735 - classification_loss: 0.1716 294/500 [================>.............] - ETA: 1:09 - loss: 1.3463 - regression_loss: 1.1746 - classification_loss: 0.1717 295/500 [================>.............] - ETA: 1:09 - loss: 1.3473 - regression_loss: 1.1754 - classification_loss: 0.1719 296/500 [================>.............] - ETA: 1:09 - loss: 1.3483 - regression_loss: 1.1762 - classification_loss: 0.1720 297/500 [================>.............] - ETA: 1:08 - loss: 1.3493 - regression_loss: 1.1771 - classification_loss: 0.1722 298/500 [================>.............] - ETA: 1:08 - loss: 1.3516 - regression_loss: 1.1788 - classification_loss: 0.1728 299/500 [================>.............] - ETA: 1:08 - loss: 1.3491 - regression_loss: 1.1767 - classification_loss: 0.1724 300/500 [=================>............] - ETA: 1:07 - loss: 1.3493 - regression_loss: 1.1767 - classification_loss: 0.1726 301/500 [=================>............] - ETA: 1:07 - loss: 1.3482 - regression_loss: 1.1758 - classification_loss: 0.1724 302/500 [=================>............] 
- ETA: 1:07 - loss: 1.3512 - regression_loss: 1.1784 - classification_loss: 0.1728
[per-step progress updates for steps 303-499 of epoch 16 elided; running loss stayed in the 1.33-1.35 range]
500/500 [==============================] - 169s 339ms/step - loss: 1.3466 - regression_loss: 1.1742 - classification_loss: 0.1724
1172 instances of class plum with average precision: 0.7680
mAP: 0.7680
Epoch 00016: saving model to ./training/snapshots/resnet101_pascal_16.h5
Epoch 17/150
[per-step progress updates for steps 1-136 of epoch 17 elided; running loss settled around 1.33]
137/500 [=======>......................]
- ETA: 2:04 - loss: 1.3301 - regression_loss: 1.1556 - classification_loss: 0.1744 138/500 [=======>......................] - ETA: 2:03 - loss: 1.3280 - regression_loss: 1.1537 - classification_loss: 0.1743 139/500 [=======>......................] - ETA: 2:03 - loss: 1.3273 - regression_loss: 1.1532 - classification_loss: 0.1740 140/500 [=======>......................] - ETA: 2:02 - loss: 1.3263 - regression_loss: 1.1526 - classification_loss: 0.1737 141/500 [=======>......................] - ETA: 2:02 - loss: 1.3285 - regression_loss: 1.1548 - classification_loss: 0.1738 142/500 [=======>......................] - ETA: 2:02 - loss: 1.3280 - regression_loss: 1.1545 - classification_loss: 0.1735 143/500 [=======>......................] - ETA: 2:01 - loss: 1.3293 - regression_loss: 1.1557 - classification_loss: 0.1736 144/500 [=======>......................] - ETA: 2:01 - loss: 1.3299 - regression_loss: 1.1561 - classification_loss: 0.1738 145/500 [=======>......................] - ETA: 2:01 - loss: 1.3327 - regression_loss: 1.1590 - classification_loss: 0.1737 146/500 [=======>......................] - ETA: 2:00 - loss: 1.3363 - regression_loss: 1.1621 - classification_loss: 0.1742 147/500 [=======>......................] - ETA: 2:00 - loss: 1.3367 - regression_loss: 1.1625 - classification_loss: 0.1742 148/500 [=======>......................] - ETA: 2:00 - loss: 1.3396 - regression_loss: 1.1649 - classification_loss: 0.1747 149/500 [=======>......................] - ETA: 1:59 - loss: 1.3393 - regression_loss: 1.1646 - classification_loss: 0.1747 150/500 [========>.....................] - ETA: 1:59 - loss: 1.3379 - regression_loss: 1.1633 - classification_loss: 0.1747 151/500 [========>.....................] - ETA: 1:59 - loss: 1.3382 - regression_loss: 1.1637 - classification_loss: 0.1745 152/500 [========>.....................] - ETA: 1:58 - loss: 1.3384 - regression_loss: 1.1638 - classification_loss: 0.1745 153/500 [========>.....................] 
- ETA: 1:58 - loss: 1.3397 - regression_loss: 1.1657 - classification_loss: 0.1740 154/500 [========>.....................] - ETA: 1:58 - loss: 1.3403 - regression_loss: 1.1664 - classification_loss: 0.1739 155/500 [========>.....................] - ETA: 1:57 - loss: 1.3387 - regression_loss: 1.1652 - classification_loss: 0.1735 156/500 [========>.....................] - ETA: 1:57 - loss: 1.3401 - regression_loss: 1.1665 - classification_loss: 0.1736 157/500 [========>.....................] - ETA: 1:57 - loss: 1.3384 - regression_loss: 1.1651 - classification_loss: 0.1732 158/500 [========>.....................] - ETA: 1:56 - loss: 1.3395 - regression_loss: 1.1660 - classification_loss: 0.1734 159/500 [========>.....................] - ETA: 1:56 - loss: 1.3393 - regression_loss: 1.1661 - classification_loss: 0.1732 160/500 [========>.....................] - ETA: 1:56 - loss: 1.3393 - regression_loss: 1.1659 - classification_loss: 0.1733 161/500 [========>.....................] - ETA: 1:55 - loss: 1.3382 - regression_loss: 1.1651 - classification_loss: 0.1731 162/500 [========>.....................] - ETA: 1:55 - loss: 1.3390 - regression_loss: 1.1656 - classification_loss: 0.1734 163/500 [========>.....................] - ETA: 1:54 - loss: 1.3359 - regression_loss: 1.1631 - classification_loss: 0.1728 164/500 [========>.....................] - ETA: 1:54 - loss: 1.3328 - regression_loss: 1.1604 - classification_loss: 0.1724 165/500 [========>.....................] - ETA: 1:54 - loss: 1.3310 - regression_loss: 1.1587 - classification_loss: 0.1723 166/500 [========>.....................] - ETA: 1:53 - loss: 1.3287 - regression_loss: 1.1565 - classification_loss: 0.1722 167/500 [=========>....................] - ETA: 1:53 - loss: 1.3283 - regression_loss: 1.1564 - classification_loss: 0.1720 168/500 [=========>....................] - ETA: 1:53 - loss: 1.3279 - regression_loss: 1.1560 - classification_loss: 0.1718 169/500 [=========>....................] 
- ETA: 1:52 - loss: 1.3244 - regression_loss: 1.1531 - classification_loss: 0.1713 170/500 [=========>....................] - ETA: 1:52 - loss: 1.3253 - regression_loss: 1.1540 - classification_loss: 0.1714 171/500 [=========>....................] - ETA: 1:52 - loss: 1.3258 - regression_loss: 1.1542 - classification_loss: 0.1716 172/500 [=========>....................] - ETA: 1:51 - loss: 1.3207 - regression_loss: 1.1499 - classification_loss: 0.1708 173/500 [=========>....................] - ETA: 1:51 - loss: 1.3218 - regression_loss: 1.1509 - classification_loss: 0.1709 174/500 [=========>....................] - ETA: 1:51 - loss: 1.3237 - regression_loss: 1.1525 - classification_loss: 0.1711 175/500 [=========>....................] - ETA: 1:50 - loss: 1.3239 - regression_loss: 1.1526 - classification_loss: 0.1713 176/500 [=========>....................] - ETA: 1:50 - loss: 1.3242 - regression_loss: 1.1527 - classification_loss: 0.1714 177/500 [=========>....................] - ETA: 1:50 - loss: 1.3256 - regression_loss: 1.1539 - classification_loss: 0.1716 178/500 [=========>....................] - ETA: 1:49 - loss: 1.3248 - regression_loss: 1.1535 - classification_loss: 0.1713 179/500 [=========>....................] - ETA: 1:49 - loss: 1.3226 - regression_loss: 1.1515 - classification_loss: 0.1711 180/500 [=========>....................] - ETA: 1:49 - loss: 1.3222 - regression_loss: 1.1515 - classification_loss: 0.1707 181/500 [=========>....................] - ETA: 1:48 - loss: 1.3240 - regression_loss: 1.1531 - classification_loss: 0.1710 182/500 [=========>....................] - ETA: 1:48 - loss: 1.3237 - regression_loss: 1.1530 - classification_loss: 0.1707 183/500 [=========>....................] - ETA: 1:48 - loss: 1.3229 - regression_loss: 1.1524 - classification_loss: 0.1705 184/500 [==========>...................] - ETA: 1:47 - loss: 1.3208 - regression_loss: 1.1508 - classification_loss: 0.1700 185/500 [==========>...................] 
- ETA: 1:47 - loss: 1.3208 - regression_loss: 1.1510 - classification_loss: 0.1698 186/500 [==========>...................] - ETA: 1:47 - loss: 1.3195 - regression_loss: 1.1499 - classification_loss: 0.1696 187/500 [==========>...................] - ETA: 1:46 - loss: 1.3184 - regression_loss: 1.1491 - classification_loss: 0.1693 188/500 [==========>...................] - ETA: 1:46 - loss: 1.3154 - regression_loss: 1.1459 - classification_loss: 0.1695 189/500 [==========>...................] - ETA: 1:46 - loss: 1.3155 - regression_loss: 1.1460 - classification_loss: 0.1695 190/500 [==========>...................] - ETA: 1:45 - loss: 1.3152 - regression_loss: 1.1458 - classification_loss: 0.1695 191/500 [==========>...................] - ETA: 1:45 - loss: 1.3122 - regression_loss: 1.1430 - classification_loss: 0.1691 192/500 [==========>...................] - ETA: 1:45 - loss: 1.3144 - regression_loss: 1.1449 - classification_loss: 0.1695 193/500 [==========>...................] - ETA: 1:44 - loss: 1.3109 - regression_loss: 1.1420 - classification_loss: 0.1689 194/500 [==========>...................] - ETA: 1:44 - loss: 1.3112 - regression_loss: 1.1425 - classification_loss: 0.1687 195/500 [==========>...................] - ETA: 1:44 - loss: 1.3120 - regression_loss: 1.1436 - classification_loss: 0.1685 196/500 [==========>...................] - ETA: 1:43 - loss: 1.3128 - regression_loss: 1.1444 - classification_loss: 0.1684 197/500 [==========>...................] - ETA: 1:43 - loss: 1.3146 - regression_loss: 1.1459 - classification_loss: 0.1687 198/500 [==========>...................] - ETA: 1:43 - loss: 1.3164 - regression_loss: 1.1473 - classification_loss: 0.1691 199/500 [==========>...................] - ETA: 1:42 - loss: 1.3127 - regression_loss: 1.1440 - classification_loss: 0.1686 200/500 [===========>..................] - ETA: 1:42 - loss: 1.3121 - regression_loss: 1.1436 - classification_loss: 0.1685 201/500 [===========>..................] 
- ETA: 1:42 - loss: 1.3144 - regression_loss: 1.1454 - classification_loss: 0.1690 202/500 [===========>..................] - ETA: 1:41 - loss: 1.3144 - regression_loss: 1.1455 - classification_loss: 0.1689 203/500 [===========>..................] - ETA: 1:41 - loss: 1.3173 - regression_loss: 1.1479 - classification_loss: 0.1694 204/500 [===========>..................] - ETA: 1:41 - loss: 1.3262 - regression_loss: 1.1553 - classification_loss: 0.1709 205/500 [===========>..................] - ETA: 1:40 - loss: 1.3267 - regression_loss: 1.1558 - classification_loss: 0.1710 206/500 [===========>..................] - ETA: 1:40 - loss: 1.3291 - regression_loss: 1.1579 - classification_loss: 0.1712 207/500 [===========>..................] - ETA: 1:40 - loss: 1.3289 - regression_loss: 1.1579 - classification_loss: 0.1710 208/500 [===========>..................] - ETA: 1:39 - loss: 1.3292 - regression_loss: 1.1582 - classification_loss: 0.1710 209/500 [===========>..................] - ETA: 1:39 - loss: 1.3295 - regression_loss: 1.1585 - classification_loss: 0.1710 210/500 [===========>..................] - ETA: 1:38 - loss: 1.3293 - regression_loss: 1.1583 - classification_loss: 0.1710 211/500 [===========>..................] - ETA: 1:38 - loss: 1.3302 - regression_loss: 1.1592 - classification_loss: 0.1710 212/500 [===========>..................] - ETA: 1:38 - loss: 1.3310 - regression_loss: 1.1598 - classification_loss: 0.1712 213/500 [===========>..................] - ETA: 1:38 - loss: 1.3315 - regression_loss: 1.1604 - classification_loss: 0.1711 214/500 [===========>..................] - ETA: 1:37 - loss: 1.3321 - regression_loss: 1.1609 - classification_loss: 0.1712 215/500 [===========>..................] - ETA: 1:37 - loss: 1.3339 - regression_loss: 1.1620 - classification_loss: 0.1719 216/500 [===========>..................] - ETA: 1:37 - loss: 1.3347 - regression_loss: 1.1627 - classification_loss: 0.1720 217/500 [============>.................] 
- ETA: 1:36 - loss: 1.3371 - regression_loss: 1.1649 - classification_loss: 0.1722 218/500 [============>.................] - ETA: 1:36 - loss: 1.3382 - regression_loss: 1.1660 - classification_loss: 0.1722 219/500 [============>.................] - ETA: 1:36 - loss: 1.3384 - regression_loss: 1.1661 - classification_loss: 0.1723 220/500 [============>.................] - ETA: 1:35 - loss: 1.3373 - regression_loss: 1.1650 - classification_loss: 0.1723 221/500 [============>.................] - ETA: 1:35 - loss: 1.3387 - regression_loss: 1.1662 - classification_loss: 0.1725 222/500 [============>.................] - ETA: 1:34 - loss: 1.3398 - regression_loss: 1.1672 - classification_loss: 0.1726 223/500 [============>.................] - ETA: 1:34 - loss: 1.3397 - regression_loss: 1.1669 - classification_loss: 0.1727 224/500 [============>.................] - ETA: 1:34 - loss: 1.3396 - regression_loss: 1.1668 - classification_loss: 0.1728 225/500 [============>.................] - ETA: 1:33 - loss: 1.3401 - regression_loss: 1.1673 - classification_loss: 0.1729 226/500 [============>.................] - ETA: 1:33 - loss: 1.3387 - regression_loss: 1.1661 - classification_loss: 0.1726 227/500 [============>.................] - ETA: 1:33 - loss: 1.3402 - regression_loss: 1.1671 - classification_loss: 0.1731 228/500 [============>.................] - ETA: 1:32 - loss: 1.3395 - regression_loss: 1.1665 - classification_loss: 0.1730 229/500 [============>.................] - ETA: 1:32 - loss: 1.3397 - regression_loss: 1.1666 - classification_loss: 0.1730 230/500 [============>.................] - ETA: 1:32 - loss: 1.3393 - regression_loss: 1.1663 - classification_loss: 0.1730 231/500 [============>.................] - ETA: 1:31 - loss: 1.3370 - regression_loss: 1.1643 - classification_loss: 0.1727 232/500 [============>.................] - ETA: 1:31 - loss: 1.3348 - regression_loss: 1.1626 - classification_loss: 0.1722 233/500 [============>.................] 
- ETA: 1:31 - loss: 1.3349 - regression_loss: 1.1630 - classification_loss: 0.1720 234/500 [=============>................] - ETA: 1:30 - loss: 1.3326 - regression_loss: 1.1611 - classification_loss: 0.1714 235/500 [=============>................] - ETA: 1:30 - loss: 1.3307 - regression_loss: 1.1597 - classification_loss: 0.1710 236/500 [=============>................] - ETA: 1:30 - loss: 1.3281 - regression_loss: 1.1576 - classification_loss: 0.1705 237/500 [=============>................] - ETA: 1:29 - loss: 1.3256 - regression_loss: 1.1557 - classification_loss: 0.1699 238/500 [=============>................] - ETA: 1:29 - loss: 1.3245 - regression_loss: 1.1548 - classification_loss: 0.1697 239/500 [=============>................] - ETA: 1:29 - loss: 1.3248 - regression_loss: 1.1550 - classification_loss: 0.1698 240/500 [=============>................] - ETA: 1:28 - loss: 1.3250 - regression_loss: 1.1552 - classification_loss: 0.1698 241/500 [=============>................] - ETA: 1:28 - loss: 1.3235 - regression_loss: 1.1540 - classification_loss: 0.1695 242/500 [=============>................] - ETA: 1:28 - loss: 1.3205 - regression_loss: 1.1516 - classification_loss: 0.1690 243/500 [=============>................] - ETA: 1:27 - loss: 1.3205 - regression_loss: 1.1514 - classification_loss: 0.1691 244/500 [=============>................] - ETA: 1:27 - loss: 1.3207 - regression_loss: 1.1516 - classification_loss: 0.1691 245/500 [=============>................] - ETA: 1:27 - loss: 1.3211 - regression_loss: 1.1521 - classification_loss: 0.1690 246/500 [=============>................] - ETA: 1:26 - loss: 1.3189 - regression_loss: 1.1503 - classification_loss: 0.1685 247/500 [=============>................] - ETA: 1:26 - loss: 1.3160 - regression_loss: 1.1475 - classification_loss: 0.1685 248/500 [=============>................] - ETA: 1:26 - loss: 1.3171 - regression_loss: 1.1484 - classification_loss: 0.1687 249/500 [=============>................] 
- ETA: 1:25 - loss: 1.3166 - regression_loss: 1.1481 - classification_loss: 0.1685 250/500 [==============>...............] - ETA: 1:25 - loss: 1.3144 - regression_loss: 1.1460 - classification_loss: 0.1684 251/500 [==============>...............] - ETA: 1:25 - loss: 1.3150 - regression_loss: 1.1466 - classification_loss: 0.1684 252/500 [==============>...............] - ETA: 1:24 - loss: 1.3149 - regression_loss: 1.1465 - classification_loss: 0.1684 253/500 [==============>...............] - ETA: 1:24 - loss: 1.3168 - regression_loss: 1.1479 - classification_loss: 0.1689 254/500 [==============>...............] - ETA: 1:24 - loss: 1.3173 - regression_loss: 1.1482 - classification_loss: 0.1691 255/500 [==============>...............] - ETA: 1:23 - loss: 1.3182 - regression_loss: 1.1487 - classification_loss: 0.1694 256/500 [==============>...............] - ETA: 1:23 - loss: 1.3180 - regression_loss: 1.1487 - classification_loss: 0.1692 257/500 [==============>...............] - ETA: 1:23 - loss: 1.3167 - regression_loss: 1.1476 - classification_loss: 0.1691 258/500 [==============>...............] - ETA: 1:22 - loss: 1.3160 - regression_loss: 1.1473 - classification_loss: 0.1688 259/500 [==============>...............] - ETA: 1:22 - loss: 1.3125 - regression_loss: 1.1442 - classification_loss: 0.1683 260/500 [==============>...............] - ETA: 1:21 - loss: 1.3103 - regression_loss: 1.1425 - classification_loss: 0.1678 261/500 [==============>...............] - ETA: 1:21 - loss: 1.3098 - regression_loss: 1.1420 - classification_loss: 0.1679 262/500 [==============>...............] - ETA: 1:21 - loss: 1.3079 - regression_loss: 1.1403 - classification_loss: 0.1676 263/500 [==============>...............] - ETA: 1:20 - loss: 1.3072 - regression_loss: 1.1398 - classification_loss: 0.1675 264/500 [==============>...............] - ETA: 1:20 - loss: 1.3064 - regression_loss: 1.1392 - classification_loss: 0.1672 265/500 [==============>...............] 
- ETA: 1:20 - loss: 1.3042 - regression_loss: 1.1373 - classification_loss: 0.1669 266/500 [==============>...............] - ETA: 1:19 - loss: 1.3014 - regression_loss: 1.1350 - classification_loss: 0.1665 267/500 [===============>..............] - ETA: 1:19 - loss: 1.3050 - regression_loss: 1.1376 - classification_loss: 0.1674 268/500 [===============>..............] - ETA: 1:19 - loss: 1.3062 - regression_loss: 1.1388 - classification_loss: 0.1674 269/500 [===============>..............] - ETA: 1:18 - loss: 1.3068 - regression_loss: 1.1393 - classification_loss: 0.1676 270/500 [===============>..............] - ETA: 1:18 - loss: 1.3151 - regression_loss: 1.1457 - classification_loss: 0.1694 271/500 [===============>..............] - ETA: 1:18 - loss: 1.3119 - regression_loss: 1.1430 - classification_loss: 0.1690 272/500 [===============>..............] - ETA: 1:17 - loss: 1.3153 - regression_loss: 1.1462 - classification_loss: 0.1692 273/500 [===============>..............] - ETA: 1:17 - loss: 1.3148 - regression_loss: 1.1457 - classification_loss: 0.1690 274/500 [===============>..............] - ETA: 1:17 - loss: 1.3172 - regression_loss: 1.1477 - classification_loss: 0.1695 275/500 [===============>..............] - ETA: 1:16 - loss: 1.3179 - regression_loss: 1.1482 - classification_loss: 0.1696 276/500 [===============>..............] - ETA: 1:16 - loss: 1.3181 - regression_loss: 1.1485 - classification_loss: 0.1697 277/500 [===============>..............] - ETA: 1:16 - loss: 1.3174 - regression_loss: 1.1479 - classification_loss: 0.1695 278/500 [===============>..............] - ETA: 1:15 - loss: 1.3157 - regression_loss: 1.1464 - classification_loss: 0.1693 279/500 [===============>..............] - ETA: 1:15 - loss: 1.3152 - regression_loss: 1.1460 - classification_loss: 0.1692 280/500 [===============>..............] - ETA: 1:15 - loss: 1.3157 - regression_loss: 1.1466 - classification_loss: 0.1691 281/500 [===============>..............] 
- ETA: 1:14 - loss: 1.3127 - regression_loss: 1.1440 - classification_loss: 0.1687 282/500 [===============>..............] - ETA: 1:14 - loss: 1.3118 - regression_loss: 1.1433 - classification_loss: 0.1685 283/500 [===============>..............] - ETA: 1:14 - loss: 1.3106 - regression_loss: 1.1423 - classification_loss: 0.1683 284/500 [================>.............] - ETA: 1:13 - loss: 1.3114 - regression_loss: 1.1429 - classification_loss: 0.1685 285/500 [================>.............] - ETA: 1:13 - loss: 1.3096 - regression_loss: 1.1413 - classification_loss: 0.1683 286/500 [================>.............] - ETA: 1:13 - loss: 1.3102 - regression_loss: 1.1419 - classification_loss: 0.1684 287/500 [================>.............] - ETA: 1:12 - loss: 1.3109 - regression_loss: 1.1424 - classification_loss: 0.1685 288/500 [================>.............] - ETA: 1:12 - loss: 1.3073 - regression_loss: 1.1392 - classification_loss: 0.1681 289/500 [================>.............] - ETA: 1:11 - loss: 1.3063 - regression_loss: 1.1384 - classification_loss: 0.1679 290/500 [================>.............] - ETA: 1:11 - loss: 1.3085 - regression_loss: 1.1404 - classification_loss: 0.1681 291/500 [================>.............] - ETA: 1:11 - loss: 1.3105 - regression_loss: 1.1419 - classification_loss: 0.1685 292/500 [================>.............] - ETA: 1:10 - loss: 1.3118 - regression_loss: 1.1430 - classification_loss: 0.1688 293/500 [================>.............] - ETA: 1:10 - loss: 1.3139 - regression_loss: 1.1447 - classification_loss: 0.1691 294/500 [================>.............] - ETA: 1:10 - loss: 1.3132 - regression_loss: 1.1443 - classification_loss: 0.1689 295/500 [================>.............] - ETA: 1:09 - loss: 1.3139 - regression_loss: 1.1449 - classification_loss: 0.1690 296/500 [================>.............] - ETA: 1:09 - loss: 1.3150 - regression_loss: 1.1457 - classification_loss: 0.1692 297/500 [================>.............] 
- ETA: 1:09 - loss: 1.3146 - regression_loss: 1.1454 - classification_loss: 0.1693 298/500 [================>.............] - ETA: 1:08 - loss: 1.3126 - regression_loss: 1.1437 - classification_loss: 0.1690 299/500 [================>.............] - ETA: 1:08 - loss: 1.3125 - regression_loss: 1.1435 - classification_loss: 0.1689 300/500 [=================>............] - ETA: 1:08 - loss: 1.3114 - regression_loss: 1.1426 - classification_loss: 0.1687 301/500 [=================>............] - ETA: 1:07 - loss: 1.3120 - regression_loss: 1.1432 - classification_loss: 0.1688 302/500 [=================>............] - ETA: 1:07 - loss: 1.3139 - regression_loss: 1.1446 - classification_loss: 0.1693 303/500 [=================>............] - ETA: 1:07 - loss: 1.3145 - regression_loss: 1.1451 - classification_loss: 0.1694 304/500 [=================>............] - ETA: 1:06 - loss: 1.3152 - regression_loss: 1.1458 - classification_loss: 0.1694 305/500 [=================>............] - ETA: 1:06 - loss: 1.3155 - regression_loss: 1.1458 - classification_loss: 0.1697 306/500 [=================>............] - ETA: 1:06 - loss: 1.3135 - regression_loss: 1.1442 - classification_loss: 0.1693 307/500 [=================>............] - ETA: 1:05 - loss: 1.3133 - regression_loss: 1.1440 - classification_loss: 0.1693 308/500 [=================>............] - ETA: 1:05 - loss: 1.3142 - regression_loss: 1.1447 - classification_loss: 0.1695 309/500 [=================>............] - ETA: 1:05 - loss: 1.3133 - regression_loss: 1.1440 - classification_loss: 0.1694 310/500 [=================>............] - ETA: 1:04 - loss: 1.3143 - regression_loss: 1.1449 - classification_loss: 0.1695 311/500 [=================>............] - ETA: 1:04 - loss: 1.3138 - regression_loss: 1.1443 - classification_loss: 0.1695 312/500 [=================>............] - ETA: 1:04 - loss: 1.3137 - regression_loss: 1.1443 - classification_loss: 0.1694 313/500 [=================>............] 
- ETA: 1:03 - loss: 1.3133 - regression_loss: 1.1438 - classification_loss: 0.1694 314/500 [=================>............] - ETA: 1:03 - loss: 1.3128 - regression_loss: 1.1433 - classification_loss: 0.1695 315/500 [=================>............] - ETA: 1:03 - loss: 1.3122 - regression_loss: 1.1428 - classification_loss: 0.1694 316/500 [=================>............] - ETA: 1:02 - loss: 1.3138 - regression_loss: 1.1441 - classification_loss: 0.1697 317/500 [==================>...........] - ETA: 1:02 - loss: 1.3127 - regression_loss: 1.1433 - classification_loss: 0.1694 318/500 [==================>...........] - ETA: 1:02 - loss: 1.3110 - regression_loss: 1.1419 - classification_loss: 0.1691 319/500 [==================>...........] - ETA: 1:01 - loss: 1.3119 - regression_loss: 1.1428 - classification_loss: 0.1692 320/500 [==================>...........] - ETA: 1:01 - loss: 1.3113 - regression_loss: 1.1422 - classification_loss: 0.1691 321/500 [==================>...........] - ETA: 1:01 - loss: 1.3112 - regression_loss: 1.1422 - classification_loss: 0.1690 322/500 [==================>...........] - ETA: 1:00 - loss: 1.3108 - regression_loss: 1.1419 - classification_loss: 0.1689 323/500 [==================>...........] - ETA: 1:00 - loss: 1.3103 - regression_loss: 1.1416 - classification_loss: 0.1688 324/500 [==================>...........] - ETA: 1:00 - loss: 1.3078 - regression_loss: 1.1393 - classification_loss: 0.1685 325/500 [==================>...........] - ETA: 59s - loss: 1.3066 - regression_loss: 1.1384 - classification_loss: 0.1682  326/500 [==================>...........] - ETA: 59s - loss: 1.3047 - regression_loss: 1.1368 - classification_loss: 0.1678 327/500 [==================>...........] - ETA: 59s - loss: 1.3049 - regression_loss: 1.1372 - classification_loss: 0.1677 328/500 [==================>...........] - ETA: 58s - loss: 1.3044 - regression_loss: 1.1368 - classification_loss: 0.1676 329/500 [==================>...........] 
- ETA: 58s - loss: 1.3035 - regression_loss: 1.1361 - classification_loss: 0.1674 330/500 [==================>...........] - ETA: 57s - loss: 1.3029 - regression_loss: 1.1356 - classification_loss: 0.1673 331/500 [==================>...........] - ETA: 57s - loss: 1.3039 - regression_loss: 1.1364 - classification_loss: 0.1675 332/500 [==================>...........] - ETA: 57s - loss: 1.3041 - regression_loss: 1.1367 - classification_loss: 0.1675 333/500 [==================>...........] - ETA: 56s - loss: 1.3051 - regression_loss: 1.1376 - classification_loss: 0.1675 334/500 [===================>..........] - ETA: 56s - loss: 1.3051 - regression_loss: 1.1377 - classification_loss: 0.1673 335/500 [===================>..........] - ETA: 56s - loss: 1.3058 - regression_loss: 1.1384 - classification_loss: 0.1673 336/500 [===================>..........] - ETA: 55s - loss: 1.3065 - regression_loss: 1.1391 - classification_loss: 0.1674 337/500 [===================>..........] - ETA: 55s - loss: 1.3043 - regression_loss: 1.1372 - classification_loss: 0.1672 338/500 [===================>..........] - ETA: 55s - loss: 1.3028 - regression_loss: 1.1360 - classification_loss: 0.1668 339/500 [===================>..........] - ETA: 54s - loss: 1.3030 - regression_loss: 1.1362 - classification_loss: 0.1668 340/500 [===================>..........] - ETA: 54s - loss: 1.3035 - regression_loss: 1.1367 - classification_loss: 0.1668 341/500 [===================>..........] - ETA: 54s - loss: 1.3048 - regression_loss: 1.1377 - classification_loss: 0.1671 342/500 [===================>..........] - ETA: 53s - loss: 1.3044 - regression_loss: 1.1373 - classification_loss: 0.1671 343/500 [===================>..........] - ETA: 53s - loss: 1.3059 - regression_loss: 1.1384 - classification_loss: 0.1675 344/500 [===================>..........] - ETA: 53s - loss: 1.3052 - regression_loss: 1.1377 - classification_loss: 0.1674 345/500 [===================>..........] 
- ETA: 52s - loss: 1.3047 - regression_loss: 1.1374 - classification_loss: 0.1673 346/500 [===================>..........] - ETA: 52s - loss: 1.3053 - regression_loss: 1.1379 - classification_loss: 0.1674 347/500 [===================>..........] - ETA: 52s - loss: 1.3033 - regression_loss: 1.1363 - classification_loss: 0.1670 348/500 [===================>..........] - ETA: 51s - loss: 1.3043 - regression_loss: 1.1372 - classification_loss: 0.1671 349/500 [===================>..........] - ETA: 51s - loss: 1.3029 - regression_loss: 1.1361 - classification_loss: 0.1668 350/500 [====================>.........] - ETA: 51s - loss: 1.3027 - regression_loss: 1.1360 - classification_loss: 0.1667 351/500 [====================>.........] - ETA: 50s - loss: 1.3014 - regression_loss: 1.1349 - classification_loss: 0.1665 352/500 [====================>.........] - ETA: 50s - loss: 1.3013 - regression_loss: 1.1348 - classification_loss: 0.1665 353/500 [====================>.........] - ETA: 50s - loss: 1.3004 - regression_loss: 1.1339 - classification_loss: 0.1664 354/500 [====================>.........] - ETA: 49s - loss: 1.3013 - regression_loss: 1.1348 - classification_loss: 0.1665 355/500 [====================>.........] - ETA: 49s - loss: 1.3005 - regression_loss: 1.1343 - classification_loss: 0.1662 356/500 [====================>.........] - ETA: 49s - loss: 1.3016 - regression_loss: 1.1351 - classification_loss: 0.1664 357/500 [====================>.........] - ETA: 48s - loss: 1.3004 - regression_loss: 1.1341 - classification_loss: 0.1663 358/500 [====================>.........] - ETA: 48s - loss: 1.3008 - regression_loss: 1.1345 - classification_loss: 0.1662 359/500 [====================>.........] - ETA: 48s - loss: 1.3000 - regression_loss: 1.1338 - classification_loss: 0.1663 360/500 [====================>.........] - ETA: 47s - loss: 1.3056 - regression_loss: 1.1383 - classification_loss: 0.1672 361/500 [====================>.........] 
- ETA: 47s - loss: 1.3055 - regression_loss: 1.1382 - classification_loss: 0.1673 362/500 [====================>.........] - ETA: 47s - loss: 1.3049 - regression_loss: 1.1378 - classification_loss: 0.1672 363/500 [====================>.........] - ETA: 46s - loss: 1.3063 - regression_loss: 1.1388 - classification_loss: 0.1674 364/500 [====================>.........] - ETA: 46s - loss: 1.3046 - regression_loss: 1.1375 - classification_loss: 0.1672 365/500 [====================>.........] - ETA: 46s - loss: 1.3055 - regression_loss: 1.1383 - classification_loss: 0.1672 366/500 [====================>.........] - ETA: 45s - loss: 1.3058 - regression_loss: 1.1384 - classification_loss: 0.1673 367/500 [=====================>........] - ETA: 45s - loss: 1.3069 - regression_loss: 1.1393 - classification_loss: 0.1676 368/500 [=====================>........] - ETA: 45s - loss: 1.3060 - regression_loss: 1.1385 - classification_loss: 0.1674 369/500 [=====================>........] - ETA: 44s - loss: 1.3059 - regression_loss: 1.1386 - classification_loss: 0.1673 370/500 [=====================>........] - ETA: 44s - loss: 1.3042 - regression_loss: 1.1372 - classification_loss: 0.1670 371/500 [=====================>........] - ETA: 43s - loss: 1.3036 - regression_loss: 1.1367 - classification_loss: 0.1670 372/500 [=====================>........] - ETA: 43s - loss: 1.3043 - regression_loss: 1.1374 - classification_loss: 0.1669 373/500 [=====================>........] - ETA: 43s - loss: 1.3027 - regression_loss: 1.1361 - classification_loss: 0.1666 374/500 [=====================>........] - ETA: 42s - loss: 1.3030 - regression_loss: 1.1361 - classification_loss: 0.1669 375/500 [=====================>........] - ETA: 42s - loss: 1.3011 - regression_loss: 1.1345 - classification_loss: 0.1667 376/500 [=====================>........] - ETA: 42s - loss: 1.3013 - regression_loss: 1.1347 - classification_loss: 0.1666 377/500 [=====================>........] 
- ETA: 41s - loss: 1.3014 - regression_loss: 1.1348 - classification_loss: 0.1665 378/500 [=====================>........] - ETA: 41s - loss: 1.3012 - regression_loss: 1.1346 - classification_loss: 0.1666 379/500 [=====================>........] - ETA: 41s - loss: 1.2996 - regression_loss: 1.1333 - classification_loss: 0.1663 380/500 [=====================>........] - ETA: 40s - loss: 1.2996 - regression_loss: 1.1334 - classification_loss: 0.1662 381/500 [=====================>........] - ETA: 40s - loss: 1.3008 - regression_loss: 1.1345 - classification_loss: 0.1662 382/500 [=====================>........] - ETA: 40s - loss: 1.3027 - regression_loss: 1.1361 - classification_loss: 0.1666 383/500 [=====================>........] - ETA: 39s - loss: 1.3010 - regression_loss: 1.1346 - classification_loss: 0.1664 384/500 [======================>.......] - ETA: 39s - loss: 1.2988 - regression_loss: 1.1327 - classification_loss: 0.1660 385/500 [======================>.......] - ETA: 39s - loss: 1.2996 - regression_loss: 1.1335 - classification_loss: 0.1661 386/500 [======================>.......] - ETA: 38s - loss: 1.2995 - regression_loss: 1.1334 - classification_loss: 0.1662 387/500 [======================>.......] - ETA: 38s - loss: 1.3001 - regression_loss: 1.1338 - classification_loss: 0.1663 388/500 [======================>.......] - ETA: 38s - loss: 1.3002 - regression_loss: 1.1338 - classification_loss: 0.1664 389/500 [======================>.......] - ETA: 37s - loss: 1.3001 - regression_loss: 1.1335 - classification_loss: 0.1666 390/500 [======================>.......] - ETA: 37s - loss: 1.2996 - regression_loss: 1.1331 - classification_loss: 0.1665 391/500 [======================>.......] - ETA: 37s - loss: 1.3008 - regression_loss: 1.1340 - classification_loss: 0.1667 392/500 [======================>.......] - ETA: 36s - loss: 1.3004 - regression_loss: 1.1337 - classification_loss: 0.1667 393/500 [======================>.......] 
- ETA: 36s - loss: 1.3016 - regression_loss: 1.1347 - classification_loss: 0.1668 394/500 [======================>.......] - ETA: 36s - loss: 1.3019 - regression_loss: 1.1350 - classification_loss: 0.1669 395/500 [======================>.......] - ETA: 35s - loss: 1.3010 - regression_loss: 1.1342 - classification_loss: 0.1667 396/500 [======================>.......] - ETA: 35s - loss: 1.3015 - regression_loss: 1.1346 - classification_loss: 0.1668 397/500 [======================>.......] - ETA: 35s - loss: 1.3015 - regression_loss: 1.1347 - classification_loss: 0.1668 398/500 [======================>.......] - ETA: 34s - loss: 1.3029 - regression_loss: 1.1359 - classification_loss: 0.1670 399/500 [======================>.......] - ETA: 34s - loss: 1.3026 - regression_loss: 1.1357 - classification_loss: 0.1669 400/500 [=======================>......] - ETA: 34s - loss: 1.3031 - regression_loss: 1.1361 - classification_loss: 0.1670 401/500 [=======================>......] - ETA: 33s - loss: 1.3042 - regression_loss: 1.1371 - classification_loss: 0.1671 402/500 [=======================>......] - ETA: 33s - loss: 1.3039 - regression_loss: 1.1370 - classification_loss: 0.1669 403/500 [=======================>......] - ETA: 33s - loss: 1.3027 - regression_loss: 1.1360 - classification_loss: 0.1667 404/500 [=======================>......] - ETA: 32s - loss: 1.3019 - regression_loss: 1.1354 - classification_loss: 0.1665 405/500 [=======================>......] - ETA: 32s - loss: 1.3019 - regression_loss: 1.1355 - classification_loss: 0.1665 406/500 [=======================>......] - ETA: 32s - loss: 1.3003 - regression_loss: 1.1341 - classification_loss: 0.1662 407/500 [=======================>......] - ETA: 31s - loss: 1.2988 - regression_loss: 1.1328 - classification_loss: 0.1660 408/500 [=======================>......] - ETA: 31s - loss: 1.2974 - regression_loss: 1.1316 - classification_loss: 0.1657 409/500 [=======================>......] 
- ETA: 31s - loss: 1.2969 - regression_loss: 1.1313 - classification_loss: 0.1656 410/500 [=======================>......] - ETA: 30s - loss: 1.2972 - regression_loss: 1.1315 - classification_loss: 0.1657 411/500 [=======================>......] - ETA: 30s - loss: 1.2977 - regression_loss: 1.1319 - classification_loss: 0.1658 412/500 [=======================>......] - ETA: 29s - loss: 1.2980 - regression_loss: 1.1322 - classification_loss: 0.1658 413/500 [=======================>......] - ETA: 29s - loss: 1.2963 - regression_loss: 1.1308 - classification_loss: 0.1655 414/500 [=======================>......] - ETA: 29s - loss: 1.2968 - regression_loss: 1.1308 - classification_loss: 0.1659 415/500 [=======================>......] - ETA: 28s - loss: 1.2976 - regression_loss: 1.1315 - classification_loss: 0.1662 416/500 [=======================>......] - ETA: 28s - loss: 1.2984 - regression_loss: 1.1320 - classification_loss: 0.1663 417/500 [========================>.....] - ETA: 28s - loss: 1.2978 - regression_loss: 1.1316 - classification_loss: 0.1662 418/500 [========================>.....] - ETA: 27s - loss: 1.2988 - regression_loss: 1.1324 - classification_loss: 0.1664 419/500 [========================>.....] - ETA: 27s - loss: 1.2986 - regression_loss: 1.1322 - classification_loss: 0.1664 420/500 [========================>.....] - ETA: 27s - loss: 1.2996 - regression_loss: 1.1331 - classification_loss: 0.1665 421/500 [========================>.....] - ETA: 26s - loss: 1.3006 - regression_loss: 1.1341 - classification_loss: 0.1665 422/500 [========================>.....] - ETA: 26s - loss: 1.3037 - regression_loss: 1.1368 - classification_loss: 0.1669 423/500 [========================>.....] - ETA: 26s - loss: 1.3043 - regression_loss: 1.1374 - classification_loss: 0.1670 424/500 [========================>.....] - ETA: 25s - loss: 1.3042 - regression_loss: 1.1372 - classification_loss: 0.1669 425/500 [========================>.....] 
- ETA: 25s - loss: 1.3049 - regression_loss: 1.1378 - classification_loss: 0.1670 426/500 [========================>.....] - ETA: 25s - loss: 1.3053 - regression_loss: 1.1383 - classification_loss: 0.1670 427/500 [========================>.....] - ETA: 24s - loss: 1.3045 - regression_loss: 1.1376 - classification_loss: 0.1668 428/500 [========================>.....] - ETA: 24s - loss: 1.3037 - regression_loss: 1.1370 - classification_loss: 0.1667 429/500 [========================>.....] - ETA: 24s - loss: 1.3033 - regression_loss: 1.1367 - classification_loss: 0.1666 430/500 [========================>.....] - ETA: 23s - loss: 1.3051 - regression_loss: 1.1385 - classification_loss: 0.1667 431/500 [========================>.....] - ETA: 23s - loss: 1.3058 - regression_loss: 1.1390 - classification_loss: 0.1667 432/500 [========================>.....] - ETA: 23s - loss: 1.3062 - regression_loss: 1.1395 - classification_loss: 0.1667 433/500 [========================>.....] - ETA: 22s - loss: 1.3047 - regression_loss: 1.1382 - classification_loss: 0.1664 434/500 [=========================>....] - ETA: 22s - loss: 1.3032 - regression_loss: 1.1370 - classification_loss: 0.1662 435/500 [=========================>....] - ETA: 22s - loss: 1.3022 - regression_loss: 1.1362 - classification_loss: 0.1661 436/500 [=========================>....] - ETA: 21s - loss: 1.3019 - regression_loss: 1.1359 - classification_loss: 0.1660 437/500 [=========================>....] - ETA: 21s - loss: 1.3015 - regression_loss: 1.1357 - classification_loss: 0.1658 438/500 [=========================>....] - ETA: 21s - loss: 1.3012 - regression_loss: 1.1354 - classification_loss: 0.1657 439/500 [=========================>....] - ETA: 20s - loss: 1.3016 - regression_loss: 1.1358 - classification_loss: 0.1658 440/500 [=========================>....] - ETA: 20s - loss: 1.3024 - regression_loss: 1.1365 - classification_loss: 0.1659 441/500 [=========================>....] 
- ETA: 20s - loss: 1.3047 - regression_loss: 1.1382 - classification_loss: 0.1665 442/500 [=========================>....] - ETA: 19s - loss: 1.3056 - regression_loss: 1.1390 - classification_loss: 0.1667 443/500 [=========================>....] - ETA: 19s - loss: 1.3043 - regression_loss: 1.1378 - classification_loss: 0.1665 444/500 [=========================>....] - ETA: 19s - loss: 1.3037 - regression_loss: 1.1373 - classification_loss: 0.1664 445/500 [=========================>....] - ETA: 18s - loss: 1.3026 - regression_loss: 1.1364 - classification_loss: 0.1662 446/500 [=========================>....] - ETA: 18s - loss: 1.3035 - regression_loss: 1.1372 - classification_loss: 0.1663 447/500 [=========================>....] - ETA: 18s - loss: 1.3042 - regression_loss: 1.1376 - classification_loss: 0.1666 448/500 [=========================>....] - ETA: 17s - loss: 1.3034 - regression_loss: 1.1370 - classification_loss: 0.1664 449/500 [=========================>....] - ETA: 17s - loss: 1.3053 - regression_loss: 1.1388 - classification_loss: 0.1664 450/500 [==========================>...] - ETA: 17s - loss: 1.3061 - regression_loss: 1.1394 - classification_loss: 0.1667 451/500 [==========================>...] - ETA: 16s - loss: 1.3067 - regression_loss: 1.1399 - classification_loss: 0.1668 452/500 [==========================>...] - ETA: 16s - loss: 1.3103 - regression_loss: 1.1419 - classification_loss: 0.1684 453/500 [==========================>...] - ETA: 16s - loss: 1.3112 - regression_loss: 1.1426 - classification_loss: 0.1686 454/500 [==========================>...] - ETA: 15s - loss: 1.3118 - regression_loss: 1.1431 - classification_loss: 0.1688 455/500 [==========================>...] - ETA: 15s - loss: 1.3103 - regression_loss: 1.1418 - classification_loss: 0.1685 456/500 [==========================>...] - ETA: 14s - loss: 1.3104 - regression_loss: 1.1419 - classification_loss: 0.1685 457/500 [==========================>...] 
- ETA: 14s - loss: 1.3101 - regression_loss: 1.1417 - classification_loss: 0.1684 458/500 [==========================>...] - ETA: 14s - loss: 1.3115 - regression_loss: 1.1428 - classification_loss: 0.1687 459/500 [==========================>...] - ETA: 13s - loss: 1.3119 - regression_loss: 1.1431 - classification_loss: 0.1688 460/500 [==========================>...] - ETA: 13s - loss: 1.3132 - regression_loss: 1.1442 - classification_loss: 0.1690 461/500 [==========================>...] - ETA: 13s - loss: 1.3130 - regression_loss: 1.1441 - classification_loss: 0.1689 462/500 [==========================>...] - ETA: 12s - loss: 1.3133 - regression_loss: 1.1443 - classification_loss: 0.1690 463/500 [==========================>...] - ETA: 12s - loss: 1.3119 - regression_loss: 1.1431 - classification_loss: 0.1687 464/500 [==========================>...] - ETA: 12s - loss: 1.3123 - regression_loss: 1.1434 - classification_loss: 0.1689 465/500 [==========================>...] - ETA: 11s - loss: 1.3125 - regression_loss: 1.1436 - classification_loss: 0.1688 466/500 [==========================>...] - ETA: 11s - loss: 1.3122 - regression_loss: 1.1434 - classification_loss: 0.1688 467/500 [===========================>..] - ETA: 11s - loss: 1.3119 - regression_loss: 1.1430 - classification_loss: 0.1688 468/500 [===========================>..] - ETA: 10s - loss: 1.3122 - regression_loss: 1.1434 - classification_loss: 0.1688 469/500 [===========================>..] - ETA: 10s - loss: 1.3113 - regression_loss: 1.1426 - classification_loss: 0.1687 470/500 [===========================>..] - ETA: 10s - loss: 1.3114 - regression_loss: 1.1425 - classification_loss: 0.1689 471/500 [===========================>..] - ETA: 9s - loss: 1.3102 - regression_loss: 1.1415 - classification_loss: 0.1687  472/500 [===========================>..] - ETA: 9s - loss: 1.3106 - regression_loss: 1.1419 - classification_loss: 0.1687 473/500 [===========================>..] 
- ETA: 9s - loss: 1.3114 - regression_loss: 1.1427 - classification_loss: 0.1688 474/500 [===========================>..] - ETA: 8s - loss: 1.3107 - regression_loss: 1.1422 - classification_loss: 0.1686 475/500 [===========================>..] - ETA: 8s - loss: 1.3114 - regression_loss: 1.1427 - classification_loss: 0.1687 476/500 [===========================>..] - ETA: 8s - loss: 1.3111 - regression_loss: 1.1425 - classification_loss: 0.1686 477/500 [===========================>..] - ETA: 7s - loss: 1.3109 - regression_loss: 1.1423 - classification_loss: 0.1686 478/500 [===========================>..] - ETA: 7s - loss: 1.3114 - regression_loss: 1.1426 - classification_loss: 0.1687 479/500 [===========================>..] - ETA: 7s - loss: 1.3120 - regression_loss: 1.1432 - classification_loss: 0.1688 480/500 [===========================>..] - ETA: 6s - loss: 1.3113 - regression_loss: 1.1427 - classification_loss: 0.1687 481/500 [===========================>..] - ETA: 6s - loss: 1.3121 - regression_loss: 1.1433 - classification_loss: 0.1688 482/500 [===========================>..] - ETA: 6s - loss: 1.3125 - regression_loss: 1.1437 - classification_loss: 0.1689 483/500 [===========================>..] - ETA: 5s - loss: 1.3130 - regression_loss: 1.1441 - classification_loss: 0.1689 484/500 [============================>.] - ETA: 5s - loss: 1.3137 - regression_loss: 1.1447 - classification_loss: 0.1690 485/500 [============================>.] - ETA: 5s - loss: 1.3133 - regression_loss: 1.1443 - classification_loss: 0.1690 486/500 [============================>.] - ETA: 4s - loss: 1.3139 - regression_loss: 1.1449 - classification_loss: 0.1691 487/500 [============================>.] - ETA: 4s - loss: 1.3140 - regression_loss: 1.1451 - classification_loss: 0.1689 488/500 [============================>.] - ETA: 4s - loss: 1.3142 - regression_loss: 1.1453 - classification_loss: 0.1688 489/500 [============================>.] 
- ETA: 3s - loss: 1.3155 - regression_loss: 1.1465 - classification_loss: 0.1690 490/500 [============================>.] - ETA: 3s - loss: 1.3164 - regression_loss: 1.1473 - classification_loss: 0.1691 491/500 [============================>.] - ETA: 3s - loss: 1.3168 - regression_loss: 1.1479 - classification_loss: 0.1690 492/500 [============================>.] - ETA: 2s - loss: 1.3177 - regression_loss: 1.1486 - classification_loss: 0.1692 493/500 [============================>.] - ETA: 2s - loss: 1.3188 - regression_loss: 1.1495 - classification_loss: 0.1694 494/500 [============================>.] - ETA: 2s - loss: 1.3182 - regression_loss: 1.1488 - classification_loss: 0.1694 495/500 [============================>.] - ETA: 1s - loss: 1.3178 - regression_loss: 1.1486 - classification_loss: 0.1693 496/500 [============================>.] - ETA: 1s - loss: 1.3189 - regression_loss: 1.1495 - classification_loss: 0.1694 497/500 [============================>.] - ETA: 1s - loss: 1.3173 - regression_loss: 1.1481 - classification_loss: 0.1692 498/500 [============================>.] - ETA: 0s - loss: 1.3173 - regression_loss: 1.1481 - classification_loss: 0.1692 499/500 [============================>.] - ETA: 0s - loss: 1.3173 - regression_loss: 1.1481 - classification_loss: 0.1691 500/500 [==============================] - 170s 341ms/step - loss: 1.3165 - regression_loss: 1.1476 - classification_loss: 0.1689
1172 instances of class plum with average precision: 0.7309
mAP: 0.7309
Epoch 00017: saving model to ./training/snapshots/resnet101_pascal_17.h5
Epoch 18/150
1/500 [..............................] - ETA: 2:45 - loss: 0.4212 - regression_loss: 0.3843 - classification_loss: 0.0369 2/500 [..............................] - ETA: 2:47 - loss: 0.9350 - regression_loss: 0.8105 - classification_loss: 0.1245 3/500 [..............................] - ETA: 2:49 - loss: 1.0592 - regression_loss: 0.9189 - classification_loss: 0.1403 4/500 [..............................]
- ETA: 2:48 - loss: 1.0503 - regression_loss: 0.8947 - classification_loss: 0.1556 5/500 [..............................] - ETA: 2:47 - loss: 0.9615 - regression_loss: 0.8260 - classification_loss: 0.1356 6/500 [..............................] - ETA: 2:46 - loss: 1.0080 - regression_loss: 0.8707 - classification_loss: 0.1373 7/500 [..............................] - ETA: 2:47 - loss: 1.1009 - regression_loss: 0.9540 - classification_loss: 0.1469 8/500 [..............................] - ETA: 2:46 - loss: 1.1085 - regression_loss: 0.9609 - classification_loss: 0.1477 9/500 [..............................] - ETA: 2:46 - loss: 1.1581 - regression_loss: 1.0060 - classification_loss: 0.1521 10/500 [..............................] - ETA: 2:46 - loss: 1.1777 - regression_loss: 1.0251 - classification_loss: 0.1526 11/500 [..............................] - ETA: 2:45 - loss: 1.1486 - regression_loss: 1.0032 - classification_loss: 0.1454 12/500 [..............................] - ETA: 2:44 - loss: 1.1550 - regression_loss: 1.0095 - classification_loss: 0.1455 13/500 [..............................] - ETA: 2:44 - loss: 1.1786 - regression_loss: 1.0341 - classification_loss: 0.1445 14/500 [..............................] - ETA: 2:45 - loss: 1.1817 - regression_loss: 1.0378 - classification_loss: 0.1439 15/500 [..............................] - ETA: 2:45 - loss: 1.1961 - regression_loss: 1.0502 - classification_loss: 0.1460 16/500 [..............................] - ETA: 2:45 - loss: 1.1692 - regression_loss: 1.0268 - classification_loss: 0.1423 17/500 [>.............................] - ETA: 2:44 - loss: 1.1806 - regression_loss: 1.0400 - classification_loss: 0.1406 18/500 [>.............................] - ETA: 2:44 - loss: 1.1382 - regression_loss: 1.0028 - classification_loss: 0.1354 19/500 [>.............................] - ETA: 2:43 - loss: 1.1349 - regression_loss: 0.9998 - classification_loss: 0.1351 20/500 [>.............................] 
- ETA: 2:43 - loss: 1.1455 - regression_loss: 1.0086 - classification_loss: 0.1369 21/500 [>.............................] - ETA: 2:43 - loss: 1.1620 - regression_loss: 1.0204 - classification_loss: 0.1416 22/500 [>.............................] - ETA: 2:43 - loss: 1.1259 - regression_loss: 0.9890 - classification_loss: 0.1369 23/500 [>.............................] - ETA: 2:43 - loss: 1.1049 - regression_loss: 0.9709 - classification_loss: 0.1339 24/500 [>.............................] - ETA: 2:42 - loss: 1.1093 - regression_loss: 0.9746 - classification_loss: 0.1348 25/500 [>.............................] - ETA: 2:42 - loss: 1.1093 - regression_loss: 0.9745 - classification_loss: 0.1348 26/500 [>.............................] - ETA: 2:42 - loss: 1.1299 - regression_loss: 0.9900 - classification_loss: 0.1399 27/500 [>.............................] - ETA: 2:41 - loss: 1.1408 - regression_loss: 0.9992 - classification_loss: 0.1416 28/500 [>.............................] - ETA: 2:41 - loss: 1.1436 - regression_loss: 1.0007 - classification_loss: 0.1429 29/500 [>.............................] - ETA: 2:41 - loss: 1.1453 - regression_loss: 1.0032 - classification_loss: 0.1421 30/500 [>.............................] - ETA: 2:40 - loss: 1.1544 - regression_loss: 1.0108 - classification_loss: 0.1436 31/500 [>.............................] - ETA: 2:40 - loss: 1.1633 - regression_loss: 1.0205 - classification_loss: 0.1428 32/500 [>.............................] - ETA: 2:39 - loss: 1.1472 - regression_loss: 1.0064 - classification_loss: 0.1408 33/500 [>.............................] - ETA: 2:39 - loss: 1.1508 - regression_loss: 1.0088 - classification_loss: 0.1420 34/500 [=>............................] - ETA: 2:39 - loss: 1.1406 - regression_loss: 0.9996 - classification_loss: 0.1411 35/500 [=>............................] - ETA: 2:38 - loss: 1.1376 - regression_loss: 0.9950 - classification_loss: 0.1427 36/500 [=>............................] 
- ETA: 2:38 - loss: 1.1356 - regression_loss: 0.9926 - classification_loss: 0.1430 37/500 [=>............................] - ETA: 2:38 - loss: 1.1428 - regression_loss: 0.9972 - classification_loss: 0.1456 38/500 [=>............................] - ETA: 2:37 - loss: 1.1441 - regression_loss: 0.9983 - classification_loss: 0.1458 39/500 [=>............................] - ETA: 2:37 - loss: 1.1253 - regression_loss: 0.9810 - classification_loss: 0.1443 40/500 [=>............................] - ETA: 2:37 - loss: 1.1301 - regression_loss: 0.9843 - classification_loss: 0.1457 41/500 [=>............................] - ETA: 2:36 - loss: 1.1389 - regression_loss: 0.9915 - classification_loss: 0.1473 42/500 [=>............................] - ETA: 2:36 - loss: 1.1502 - regression_loss: 1.0022 - classification_loss: 0.1480 43/500 [=>............................] - ETA: 2:36 - loss: 1.1561 - regression_loss: 1.0055 - classification_loss: 0.1507 44/500 [=>............................] - ETA: 2:35 - loss: 1.1664 - regression_loss: 1.0140 - classification_loss: 0.1524 45/500 [=>............................] - ETA: 2:35 - loss: 1.1544 - regression_loss: 1.0037 - classification_loss: 0.1508 46/500 [=>............................] - ETA: 2:35 - loss: 1.1600 - regression_loss: 1.0096 - classification_loss: 0.1505 47/500 [=>............................] - ETA: 2:35 - loss: 1.1567 - regression_loss: 1.0059 - classification_loss: 0.1509 48/500 [=>............................] - ETA: 2:34 - loss: 1.1634 - regression_loss: 1.0111 - classification_loss: 0.1523 49/500 [=>............................] - ETA: 2:34 - loss: 1.1699 - regression_loss: 1.0164 - classification_loss: 0.1535 50/500 [==>...........................] - ETA: 2:33 - loss: 1.1621 - regression_loss: 1.0101 - classification_loss: 0.1520 51/500 [==>...........................] - ETA: 2:33 - loss: 1.1718 - regression_loss: 1.0180 - classification_loss: 0.1539 52/500 [==>...........................] 
- ETA: 2:33 - loss: 1.1556 - regression_loss: 1.0038 - classification_loss: 0.1518 53/500 [==>...........................] - ETA: 2:33 - loss: 1.1536 - regression_loss: 1.0022 - classification_loss: 0.1514 54/500 [==>...........................] - ETA: 2:32 - loss: 1.1542 - regression_loss: 1.0020 - classification_loss: 0.1522 55/500 [==>...........................] - ETA: 2:32 - loss: 1.1740 - regression_loss: 1.0190 - classification_loss: 0.1550 56/500 [==>...........................] - ETA: 2:32 - loss: 1.1829 - regression_loss: 1.0261 - classification_loss: 0.1568 57/500 [==>...........................] - ETA: 2:31 - loss: 1.1862 - regression_loss: 1.0281 - classification_loss: 0.1580 58/500 [==>...........................] - ETA: 2:31 - loss: 1.1749 - regression_loss: 1.0176 - classification_loss: 0.1573 59/500 [==>...........................] - ETA: 2:31 - loss: 1.1814 - regression_loss: 1.0232 - classification_loss: 0.1581 60/500 [==>...........................] - ETA: 2:30 - loss: 1.1836 - regression_loss: 1.0255 - classification_loss: 0.1581 61/500 [==>...........................] - ETA: 2:30 - loss: 1.1872 - regression_loss: 1.0290 - classification_loss: 0.1582 62/500 [==>...........................] - ETA: 2:30 - loss: 1.1954 - regression_loss: 1.0365 - classification_loss: 0.1589 63/500 [==>...........................] - ETA: 2:29 - loss: 1.2041 - regression_loss: 1.0433 - classification_loss: 0.1608 64/500 [==>...........................] - ETA: 2:29 - loss: 1.2147 - regression_loss: 1.0533 - classification_loss: 0.1614 65/500 [==>...........................] - ETA: 2:29 - loss: 1.2209 - regression_loss: 1.0592 - classification_loss: 0.1617 66/500 [==>...........................] - ETA: 2:28 - loss: 1.2284 - regression_loss: 1.0656 - classification_loss: 0.1628 67/500 [===>..........................] - ETA: 2:28 - loss: 1.2315 - regression_loss: 1.0686 - classification_loss: 0.1629 68/500 [===>..........................] 
- ETA: 2:28 - loss: 1.2217 - regression_loss: 1.0603 - classification_loss: 0.1614 69/500 [===>..........................] - ETA: 2:27 - loss: 1.2132 - regression_loss: 1.0529 - classification_loss: 0.1603 70/500 [===>..........................] - ETA: 2:27 - loss: 1.2187 - regression_loss: 1.0576 - classification_loss: 0.1611 71/500 [===>..........................] - ETA: 2:26 - loss: 1.2182 - regression_loss: 1.0565 - classification_loss: 0.1617 72/500 [===>..........................] - ETA: 2:26 - loss: 1.2189 - regression_loss: 1.0575 - classification_loss: 0.1614 73/500 [===>..........................] - ETA: 2:26 - loss: 1.2212 - regression_loss: 1.0594 - classification_loss: 0.1619 74/500 [===>..........................] - ETA: 2:25 - loss: 1.2231 - regression_loss: 1.0611 - classification_loss: 0.1620 75/500 [===>..........................] - ETA: 2:25 - loss: 1.2229 - regression_loss: 1.0615 - classification_loss: 0.1614 76/500 [===>..........................] - ETA: 2:24 - loss: 1.2239 - regression_loss: 1.0626 - classification_loss: 0.1613 77/500 [===>..........................] - ETA: 2:24 - loss: 1.2273 - regression_loss: 1.0659 - classification_loss: 0.1614 78/500 [===>..........................] - ETA: 2:24 - loss: 1.2292 - regression_loss: 1.0674 - classification_loss: 0.1618 79/500 [===>..........................] - ETA: 2:23 - loss: 1.2353 - regression_loss: 1.0724 - classification_loss: 0.1630 80/500 [===>..........................] - ETA: 2:23 - loss: 1.2321 - regression_loss: 1.0697 - classification_loss: 0.1623 81/500 [===>..........................] - ETA: 2:23 - loss: 1.2338 - regression_loss: 1.0709 - classification_loss: 0.1628 82/500 [===>..........................] - ETA: 2:22 - loss: 1.2319 - regression_loss: 1.0691 - classification_loss: 0.1627 83/500 [===>..........................] - ETA: 2:22 - loss: 1.2280 - regression_loss: 1.0659 - classification_loss: 0.1621 84/500 [====>.........................] 
- ETA: 2:22 - loss: 1.2342 - regression_loss: 1.0715 - classification_loss: 0.1626 85/500 [====>.........................] - ETA: 2:21 - loss: 1.2363 - regression_loss: 1.0736 - classification_loss: 0.1627 86/500 [====>.........................] - ETA: 2:21 - loss: 1.2401 - regression_loss: 1.0766 - classification_loss: 0.1636 87/500 [====>.........................] - ETA: 2:21 - loss: 1.2421 - regression_loss: 1.0781 - classification_loss: 0.1641 88/500 [====>.........................] - ETA: 2:20 - loss: 1.2422 - regression_loss: 1.0792 - classification_loss: 0.1631 89/500 [====>.........................] - ETA: 2:20 - loss: 1.2424 - regression_loss: 1.0792 - classification_loss: 0.1632 90/500 [====>.........................] - ETA: 2:20 - loss: 1.2408 - regression_loss: 1.0777 - classification_loss: 0.1632 91/500 [====>.........................] - ETA: 2:19 - loss: 1.2440 - regression_loss: 1.0804 - classification_loss: 0.1636 92/500 [====>.........................] - ETA: 2:19 - loss: 1.2440 - regression_loss: 1.0805 - classification_loss: 0.1635 93/500 [====>.........................] - ETA: 2:18 - loss: 1.2438 - regression_loss: 1.0806 - classification_loss: 0.1632 94/500 [====>.........................] - ETA: 2:18 - loss: 1.2437 - regression_loss: 1.0807 - classification_loss: 0.1630 95/500 [====>.........................] - ETA: 2:18 - loss: 1.2452 - regression_loss: 1.0820 - classification_loss: 0.1631 96/500 [====>.........................] - ETA: 2:17 - loss: 1.2507 - regression_loss: 1.0861 - classification_loss: 0.1646 97/500 [====>.........................] - ETA: 2:17 - loss: 1.2439 - regression_loss: 1.0805 - classification_loss: 0.1635 98/500 [====>.........................] - ETA: 2:17 - loss: 1.2479 - regression_loss: 1.0840 - classification_loss: 0.1639 99/500 [====>.........................] - ETA: 2:16 - loss: 1.2530 - regression_loss: 1.0884 - classification_loss: 0.1646 100/500 [=====>........................] 
- ETA: 2:16 - loss: 1.2473 - regression_loss: 1.0834 - classification_loss: 0.1639 101/500 [=====>........................] - ETA: 2:16 - loss: 1.2459 - regression_loss: 1.0820 - classification_loss: 0.1639 102/500 [=====>........................] - ETA: 2:15 - loss: 1.2498 - regression_loss: 1.0854 - classification_loss: 0.1644 103/500 [=====>........................] - ETA: 2:15 - loss: 1.2485 - regression_loss: 1.0846 - classification_loss: 0.1639 104/500 [=====>........................] - ETA: 2:15 - loss: 1.2545 - regression_loss: 1.0893 - classification_loss: 0.1651 105/500 [=====>........................] - ETA: 2:14 - loss: 1.2491 - regression_loss: 1.0849 - classification_loss: 0.1642 106/500 [=====>........................] - ETA: 2:14 - loss: 1.2427 - regression_loss: 1.0796 - classification_loss: 0.1631 107/500 [=====>........................] - ETA: 2:14 - loss: 1.2408 - regression_loss: 1.0781 - classification_loss: 0.1627 108/500 [=====>........................] - ETA: 2:13 - loss: 1.2386 - regression_loss: 1.0760 - classification_loss: 0.1626 109/500 [=====>........................] - ETA: 2:13 - loss: 1.2424 - regression_loss: 1.0793 - classification_loss: 0.1632 110/500 [=====>........................] - ETA: 2:13 - loss: 1.2476 - regression_loss: 1.0838 - classification_loss: 0.1638 111/500 [=====>........................] - ETA: 2:12 - loss: 1.2481 - regression_loss: 1.0845 - classification_loss: 0.1636 112/500 [=====>........................] - ETA: 2:12 - loss: 1.2405 - regression_loss: 1.0780 - classification_loss: 0.1625 113/500 [=====>........................] - ETA: 2:12 - loss: 1.2407 - regression_loss: 1.0784 - classification_loss: 0.1623 114/500 [=====>........................] - ETA: 2:11 - loss: 1.2444 - regression_loss: 1.0820 - classification_loss: 0.1624 115/500 [=====>........................] - ETA: 2:11 - loss: 1.2478 - regression_loss: 1.0852 - classification_loss: 0.1626 116/500 [=====>........................] 
- ETA: 2:11 - loss: 1.2425 - regression_loss: 1.0808 - classification_loss: 0.1617 117/500 [======>.......................] - ETA: 2:10 - loss: 1.2479 - regression_loss: 1.0867 - classification_loss: 0.1612 
[per-step progress updates for steps 118–451 omitted; ETA counted down from 2:10 to 16s while total loss drifted from ~1.24 to ~1.27 (regression_loss ~1.08 → ~1.11, classification_loss ~0.16 throughout)] 
- ETA: 16s - loss: 1.2724 - regression_loss: 1.1097 - classification_loss: 0.1627 452/500 [==========================>...] 
- ETA: 16s - loss: 1.2719 - regression_loss: 1.1093 - classification_loss: 0.1626 453/500 [==========================>...] - ETA: 15s - loss: 1.2711 - regression_loss: 1.1086 - classification_loss: 0.1625 454/500 [==========================>...] - ETA: 15s - loss: 1.2721 - regression_loss: 1.1095 - classification_loss: 0.1626 455/500 [==========================>...] - ETA: 15s - loss: 1.2722 - regression_loss: 1.1095 - classification_loss: 0.1627 456/500 [==========================>...] - ETA: 14s - loss: 1.2706 - regression_loss: 1.1081 - classification_loss: 0.1624 457/500 [==========================>...] - ETA: 14s - loss: 1.2704 - regression_loss: 1.1080 - classification_loss: 0.1624 458/500 [==========================>...] - ETA: 14s - loss: 1.2704 - regression_loss: 1.1081 - classification_loss: 0.1623 459/500 [==========================>...] - ETA: 13s - loss: 1.2697 - regression_loss: 1.1074 - classification_loss: 0.1622 460/500 [==========================>...] - ETA: 13s - loss: 1.2706 - regression_loss: 1.1082 - classification_loss: 0.1623 461/500 [==========================>...] - ETA: 13s - loss: 1.2704 - regression_loss: 1.1082 - classification_loss: 0.1623 462/500 [==========================>...] - ETA: 12s - loss: 1.2696 - regression_loss: 1.1075 - classification_loss: 0.1621 463/500 [==========================>...] - ETA: 12s - loss: 1.2695 - regression_loss: 1.1074 - classification_loss: 0.1621 464/500 [==========================>...] - ETA: 12s - loss: 1.2705 - regression_loss: 1.1082 - classification_loss: 0.1623 465/500 [==========================>...] - ETA: 11s - loss: 1.2693 - regression_loss: 1.1072 - classification_loss: 0.1621 466/500 [==========================>...] - ETA: 11s - loss: 1.2708 - regression_loss: 1.1084 - classification_loss: 0.1624 467/500 [===========================>..] - ETA: 11s - loss: 1.2707 - regression_loss: 1.1084 - classification_loss: 0.1623 468/500 [===========================>..] 
- ETA: 10s - loss: 1.2700 - regression_loss: 1.1078 - classification_loss: 0.1622 469/500 [===========================>..] - ETA: 10s - loss: 1.2720 - regression_loss: 1.1093 - classification_loss: 0.1627 470/500 [===========================>..] - ETA: 10s - loss: 1.2729 - regression_loss: 1.1100 - classification_loss: 0.1628 471/500 [===========================>..] - ETA: 9s - loss: 1.2715 - regression_loss: 1.1089 - classification_loss: 0.1626  472/500 [===========================>..] - ETA: 9s - loss: 1.2724 - regression_loss: 1.1095 - classification_loss: 0.1629 473/500 [===========================>..] - ETA: 9s - loss: 1.2731 - regression_loss: 1.1101 - classification_loss: 0.1630 474/500 [===========================>..] - ETA: 8s - loss: 1.2725 - regression_loss: 1.1096 - classification_loss: 0.1629 475/500 [===========================>..] - ETA: 8s - loss: 1.2724 - regression_loss: 1.1094 - classification_loss: 0.1630 476/500 [===========================>..] - ETA: 8s - loss: 1.2716 - regression_loss: 1.1087 - classification_loss: 0.1629 477/500 [===========================>..] - ETA: 7s - loss: 1.2726 - regression_loss: 1.1096 - classification_loss: 0.1629 478/500 [===========================>..] - ETA: 7s - loss: 1.2736 - regression_loss: 1.1104 - classification_loss: 0.1631 479/500 [===========================>..] - ETA: 7s - loss: 1.2736 - regression_loss: 1.1105 - classification_loss: 0.1631 480/500 [===========================>..] - ETA: 6s - loss: 1.2726 - regression_loss: 1.1097 - classification_loss: 0.1629 481/500 [===========================>..] - ETA: 6s - loss: 1.2717 - regression_loss: 1.1089 - classification_loss: 0.1629 482/500 [===========================>..] - ETA: 6s - loss: 1.2725 - regression_loss: 1.1096 - classification_loss: 0.1629 483/500 [===========================>..] - ETA: 5s - loss: 1.2731 - regression_loss: 1.1102 - classification_loss: 0.1630 484/500 [============================>.] 
- ETA: 5s - loss: 1.2733 - regression_loss: 1.1103 - classification_loss: 0.1630 485/500 [============================>.] - ETA: 5s - loss: 1.2736 - regression_loss: 1.1104 - classification_loss: 0.1632 486/500 [============================>.] - ETA: 4s - loss: 1.2742 - regression_loss: 1.1109 - classification_loss: 0.1633 487/500 [============================>.] - ETA: 4s - loss: 1.2752 - regression_loss: 1.1118 - classification_loss: 0.1634 488/500 [============================>.] - ETA: 4s - loss: 1.2758 - regression_loss: 1.1124 - classification_loss: 0.1634 489/500 [============================>.] - ETA: 3s - loss: 1.2757 - regression_loss: 1.1124 - classification_loss: 0.1633 490/500 [============================>.] - ETA: 3s - loss: 1.2755 - regression_loss: 1.1122 - classification_loss: 0.1632 491/500 [============================>.] - ETA: 3s - loss: 1.2744 - regression_loss: 1.1114 - classification_loss: 0.1630 492/500 [============================>.] - ETA: 2s - loss: 1.2731 - regression_loss: 1.1103 - classification_loss: 0.1628 493/500 [============================>.] - ETA: 2s - loss: 1.2730 - regression_loss: 1.1102 - classification_loss: 0.1627 494/500 [============================>.] - ETA: 2s - loss: 1.2724 - regression_loss: 1.1098 - classification_loss: 0.1626 495/500 [============================>.] - ETA: 1s - loss: 1.2739 - regression_loss: 1.1110 - classification_loss: 0.1629 496/500 [============================>.] - ETA: 1s - loss: 1.2741 - regression_loss: 1.1111 - classification_loss: 0.1630 497/500 [============================>.] - ETA: 1s - loss: 1.2736 - regression_loss: 1.1107 - classification_loss: 0.1629 498/500 [============================>.] - ETA: 0s - loss: 1.2740 - regression_loss: 1.1110 - classification_loss: 0.1630 499/500 [============================>.] 
- ETA: 0s - loss: 1.2737 - regression_loss: 1.1108 - classification_loss: 0.1629 500/500 [==============================] - 169s 339ms/step - loss: 1.2723 - regression_loss: 1.1097 - classification_loss: 0.1627 1172 instances of class plum with average precision: 0.7515 mAP: 0.7515 Epoch 00018: saving model to ./training/snapshots/resnet101_pascal_18.h5 Epoch 19/150 1/500 [..............................] - ETA: 2:44 - loss: 1.2193 - regression_loss: 0.9741 - classification_loss: 0.2452 2/500 [..............................] - ETA: 2:49 - loss: 1.3201 - regression_loss: 1.1043 - classification_loss: 0.2158 3/500 [..............................] - ETA: 2:49 - loss: 1.2781 - regression_loss: 1.0805 - classification_loss: 0.1975 4/500 [..............................] - ETA: 2:49 - loss: 1.3618 - regression_loss: 1.1635 - classification_loss: 0.1983 5/500 [..............................] - ETA: 2:49 - loss: 1.2311 - regression_loss: 1.0546 - classification_loss: 0.1765 6/500 [..............................] - ETA: 2:47 - loss: 1.1690 - regression_loss: 1.0029 - classification_loss: 0.1661 7/500 [..............................] - ETA: 2:46 - loss: 1.1371 - regression_loss: 0.9822 - classification_loss: 0.1549 8/500 [..............................] - ETA: 2:47 - loss: 1.1759 - regression_loss: 1.0166 - classification_loss: 0.1593 9/500 [..............................] - ETA: 2:47 - loss: 1.1645 - regression_loss: 1.0111 - classification_loss: 0.1534 10/500 [..............................] - ETA: 2:46 - loss: 1.1696 - regression_loss: 1.0141 - classification_loss: 0.1554 11/500 [..............................] - ETA: 2:46 - loss: 1.1901 - regression_loss: 1.0359 - classification_loss: 0.1542 12/500 [..............................] - ETA: 2:46 - loss: 1.3152 - regression_loss: 1.1460 - classification_loss: 0.1691 13/500 [..............................] - ETA: 2:45 - loss: 1.3155 - regression_loss: 1.1477 - classification_loss: 0.1678 14/500 [..............................] 
- ETA: 2:44 - loss: 1.2751 - regression_loss: 1.1137 - classification_loss: 0.1613 15/500 [..............................] - ETA: 2:45 - loss: 1.2900 - regression_loss: 1.1250 - classification_loss: 0.1650 16/500 [..............................] - ETA: 2:44 - loss: 1.2923 - regression_loss: 1.1278 - classification_loss: 0.1645 17/500 [>.............................] - ETA: 2:44 - loss: 1.3440 - regression_loss: 1.1725 - classification_loss: 0.1714 18/500 [>.............................] - ETA: 2:44 - loss: 1.3437 - regression_loss: 1.1714 - classification_loss: 0.1723 19/500 [>.............................] - ETA: 2:44 - loss: 1.3304 - regression_loss: 1.1609 - classification_loss: 0.1694 20/500 [>.............................] - ETA: 2:44 - loss: 1.3262 - regression_loss: 1.1583 - classification_loss: 0.1679 21/500 [>.............................] - ETA: 2:44 - loss: 1.3106 - regression_loss: 1.1451 - classification_loss: 0.1655 22/500 [>.............................] - ETA: 2:43 - loss: 1.3188 - regression_loss: 1.1543 - classification_loss: 0.1644 23/500 [>.............................] - ETA: 2:43 - loss: 1.3215 - regression_loss: 1.1573 - classification_loss: 0.1643 24/500 [>.............................] - ETA: 2:43 - loss: 1.3233 - regression_loss: 1.1598 - classification_loss: 0.1634 25/500 [>.............................] - ETA: 2:42 - loss: 1.3258 - regression_loss: 1.1633 - classification_loss: 0.1625 26/500 [>.............................] - ETA: 2:42 - loss: 1.3012 - regression_loss: 1.1409 - classification_loss: 0.1603 27/500 [>.............................] - ETA: 2:42 - loss: 1.2853 - regression_loss: 1.1270 - classification_loss: 0.1584 28/500 [>.............................] - ETA: 2:41 - loss: 1.2996 - regression_loss: 1.1376 - classification_loss: 0.1619 29/500 [>.............................] - ETA: 2:41 - loss: 1.2935 - regression_loss: 1.1314 - classification_loss: 0.1621 30/500 [>.............................] 
- ETA: 2:41 - loss: 1.3001 - regression_loss: 1.1385 - classification_loss: 0.1616 31/500 [>.............................] - ETA: 2:40 - loss: 1.2774 - regression_loss: 1.1176 - classification_loss: 0.1598 32/500 [>.............................] - ETA: 2:40 - loss: 1.2850 - regression_loss: 1.1235 - classification_loss: 0.1615 33/500 [>.............................] - ETA: 2:40 - loss: 1.2833 - regression_loss: 1.1212 - classification_loss: 0.1621 34/500 [=>............................] - ETA: 2:39 - loss: 1.2791 - regression_loss: 1.1185 - classification_loss: 0.1606 35/500 [=>............................] - ETA: 2:39 - loss: 1.2792 - regression_loss: 1.1198 - classification_loss: 0.1594 36/500 [=>............................] - ETA: 2:39 - loss: 1.2794 - regression_loss: 1.1205 - classification_loss: 0.1590 37/500 [=>............................] - ETA: 2:38 - loss: 1.2910 - regression_loss: 1.1305 - classification_loss: 0.1605 38/500 [=>............................] - ETA: 2:38 - loss: 1.2909 - regression_loss: 1.1307 - classification_loss: 0.1602 39/500 [=>............................] - ETA: 2:38 - loss: 1.2830 - regression_loss: 1.1241 - classification_loss: 0.1589 40/500 [=>............................] - ETA: 2:37 - loss: 1.2728 - regression_loss: 1.1156 - classification_loss: 0.1572 41/500 [=>............................] - ETA: 2:36 - loss: 1.2609 - regression_loss: 1.1054 - classification_loss: 0.1554 42/500 [=>............................] - ETA: 2:36 - loss: 1.2653 - regression_loss: 1.1094 - classification_loss: 0.1559 43/500 [=>............................] - ETA: 2:36 - loss: 1.2591 - regression_loss: 1.1041 - classification_loss: 0.1549 44/500 [=>............................] - ETA: 2:35 - loss: 1.2459 - regression_loss: 1.0927 - classification_loss: 0.1532 45/500 [=>............................] - ETA: 2:35 - loss: 1.2437 - regression_loss: 1.0909 - classification_loss: 0.1528 46/500 [=>............................] 
- ETA: 2:35 - loss: 1.2511 - regression_loss: 1.0972 - classification_loss: 0.1539 47/500 [=>............................] - ETA: 2:35 - loss: 1.2384 - regression_loss: 1.0867 - classification_loss: 0.1517 48/500 [=>............................] - ETA: 2:34 - loss: 1.2418 - regression_loss: 1.0891 - classification_loss: 0.1526 49/500 [=>............................] - ETA: 2:34 - loss: 1.2329 - regression_loss: 1.0812 - classification_loss: 0.1517 50/500 [==>...........................] - ETA: 2:34 - loss: 1.2365 - regression_loss: 1.0847 - classification_loss: 0.1518 51/500 [==>...........................] - ETA: 2:33 - loss: 1.2357 - regression_loss: 1.0838 - classification_loss: 0.1519 52/500 [==>...........................] - ETA: 2:33 - loss: 1.2434 - regression_loss: 1.0902 - classification_loss: 0.1532 53/500 [==>...........................] - ETA: 2:33 - loss: 1.2451 - regression_loss: 1.0924 - classification_loss: 0.1527 54/500 [==>...........................] - ETA: 2:33 - loss: 1.2346 - regression_loss: 1.0827 - classification_loss: 0.1519 55/500 [==>...........................] - ETA: 2:32 - loss: 1.2360 - regression_loss: 1.0824 - classification_loss: 0.1536 56/500 [==>...........................] - ETA: 2:32 - loss: 1.2391 - regression_loss: 1.0846 - classification_loss: 0.1545 57/500 [==>...........................] - ETA: 2:32 - loss: 1.2425 - regression_loss: 1.0876 - classification_loss: 0.1550 58/500 [==>...........................] - ETA: 2:31 - loss: 1.2289 - regression_loss: 1.0756 - classification_loss: 0.1533 59/500 [==>...........................] - ETA: 2:31 - loss: 1.2229 - regression_loss: 1.0702 - classification_loss: 0.1527 60/500 [==>...........................] - ETA: 2:31 - loss: 1.2184 - regression_loss: 1.0658 - classification_loss: 0.1526 61/500 [==>...........................] - ETA: 2:30 - loss: 1.2139 - regression_loss: 1.0620 - classification_loss: 0.1518 62/500 [==>...........................] 
- ETA: 2:30 - loss: 1.2053 - regression_loss: 1.0547 - classification_loss: 0.1507 63/500 [==>...........................] - ETA: 2:30 - loss: 1.2120 - regression_loss: 1.0593 - classification_loss: 0.1527 64/500 [==>...........................] - ETA: 2:29 - loss: 1.2103 - regression_loss: 1.0578 - classification_loss: 0.1524 65/500 [==>...........................] - ETA: 2:29 - loss: 1.2159 - regression_loss: 1.0634 - classification_loss: 0.1525 66/500 [==>...........................] - ETA: 2:28 - loss: 1.2219 - regression_loss: 1.0685 - classification_loss: 0.1534 67/500 [===>..........................] - ETA: 2:28 - loss: 1.2267 - regression_loss: 1.0724 - classification_loss: 0.1542 68/500 [===>..........................] - ETA: 2:28 - loss: 1.2278 - regression_loss: 1.0747 - classification_loss: 0.1531 69/500 [===>..........................] - ETA: 2:27 - loss: 1.2298 - regression_loss: 1.0758 - classification_loss: 0.1541 70/500 [===>..........................] - ETA: 2:27 - loss: 1.2331 - regression_loss: 1.0780 - classification_loss: 0.1551 71/500 [===>..........................] - ETA: 2:27 - loss: 1.2384 - regression_loss: 1.0828 - classification_loss: 0.1556 72/500 [===>..........................] - ETA: 2:26 - loss: 1.2418 - regression_loss: 1.0856 - classification_loss: 0.1562 73/500 [===>..........................] - ETA: 2:26 - loss: 1.2434 - regression_loss: 1.0874 - classification_loss: 0.1561 74/500 [===>..........................] - ETA: 2:25 - loss: 1.2461 - regression_loss: 1.0902 - classification_loss: 0.1560 75/500 [===>..........................] - ETA: 2:25 - loss: 1.2494 - regression_loss: 1.0931 - classification_loss: 0.1563 76/500 [===>..........................] - ETA: 2:25 - loss: 1.2491 - regression_loss: 1.0933 - classification_loss: 0.1559 77/500 [===>..........................] - ETA: 2:24 - loss: 1.2487 - regression_loss: 1.0924 - classification_loss: 0.1563 78/500 [===>..........................] 
- ETA: 2:24 - loss: 1.2517 - regression_loss: 1.0949 - classification_loss: 0.1568 79/500 [===>..........................] - ETA: 2:24 - loss: 1.2486 - regression_loss: 1.0924 - classification_loss: 0.1562 80/500 [===>..........................] - ETA: 2:23 - loss: 1.2492 - regression_loss: 1.0929 - classification_loss: 0.1563 81/500 [===>..........................] - ETA: 2:23 - loss: 1.2476 - regression_loss: 1.0917 - classification_loss: 0.1559 82/500 [===>..........................] - ETA: 2:23 - loss: 1.2518 - regression_loss: 1.0950 - classification_loss: 0.1568 83/500 [===>..........................] - ETA: 2:22 - loss: 1.2548 - regression_loss: 1.0976 - classification_loss: 0.1572 84/500 [====>.........................] - ETA: 2:22 - loss: 1.2496 - regression_loss: 1.0931 - classification_loss: 0.1565 85/500 [====>.........................] - ETA: 2:22 - loss: 1.2474 - regression_loss: 1.0913 - classification_loss: 0.1561 86/500 [====>.........................] - ETA: 2:21 - loss: 1.2500 - regression_loss: 1.0936 - classification_loss: 0.1565 87/500 [====>.........................] - ETA: 2:21 - loss: 1.2568 - regression_loss: 1.0986 - classification_loss: 0.1582 88/500 [====>.........................] - ETA: 2:20 - loss: 1.2556 - regression_loss: 1.0974 - classification_loss: 0.1582 89/500 [====>.........................] - ETA: 2:20 - loss: 1.2560 - regression_loss: 1.0977 - classification_loss: 0.1583 90/500 [====>.........................] - ETA: 2:20 - loss: 1.2513 - regression_loss: 1.0939 - classification_loss: 0.1574 91/500 [====>.........................] - ETA: 2:19 - loss: 1.2555 - regression_loss: 1.0973 - classification_loss: 0.1582 92/500 [====>.........................] - ETA: 2:19 - loss: 1.2572 - regression_loss: 1.0986 - classification_loss: 0.1585 93/500 [====>.........................] - ETA: 2:19 - loss: 1.2560 - regression_loss: 1.0975 - classification_loss: 0.1585 94/500 [====>.........................] 
- ETA: 2:18 - loss: 1.2562 - regression_loss: 1.0979 - classification_loss: 0.1584 95/500 [====>.........................] - ETA: 2:18 - loss: 1.2551 - regression_loss: 1.0973 - classification_loss: 0.1578 96/500 [====>.........................] - ETA: 2:17 - loss: 1.2591 - regression_loss: 1.1012 - classification_loss: 0.1580 97/500 [====>.........................] - ETA: 2:17 - loss: 1.2639 - regression_loss: 1.1053 - classification_loss: 0.1586 98/500 [====>.........................] - ETA: 2:17 - loss: 1.2652 - regression_loss: 1.1065 - classification_loss: 0.1587 99/500 [====>.........................] - ETA: 2:16 - loss: 1.2619 - regression_loss: 1.1041 - classification_loss: 0.1578 100/500 [=====>........................] - ETA: 2:16 - loss: 1.2600 - regression_loss: 1.1026 - classification_loss: 0.1574 101/500 [=====>........................] - ETA: 2:16 - loss: 1.2564 - regression_loss: 1.0995 - classification_loss: 0.1569 102/500 [=====>........................] - ETA: 2:15 - loss: 1.2567 - regression_loss: 1.0998 - classification_loss: 0.1568 103/500 [=====>........................] - ETA: 2:15 - loss: 1.2554 - regression_loss: 1.0990 - classification_loss: 0.1564 104/500 [=====>........................] - ETA: 2:14 - loss: 1.2540 - regression_loss: 1.0980 - classification_loss: 0.1559 105/500 [=====>........................] - ETA: 2:14 - loss: 1.2569 - regression_loss: 1.1007 - classification_loss: 0.1562 106/500 [=====>........................] - ETA: 2:14 - loss: 1.2509 - regression_loss: 1.0953 - classification_loss: 0.1556 107/500 [=====>........................] - ETA: 2:13 - loss: 1.2560 - regression_loss: 1.0997 - classification_loss: 0.1563 108/500 [=====>........................] - ETA: 2:13 - loss: 1.2571 - regression_loss: 1.1005 - classification_loss: 0.1566 109/500 [=====>........................] - ETA: 2:13 - loss: 1.2559 - regression_loss: 1.0996 - classification_loss: 0.1563 110/500 [=====>........................] 
- ETA: 2:12 - loss: 1.2550 - regression_loss: 1.0994 - classification_loss: 0.1556 111/500 [=====>........................] - ETA: 2:12 - loss: 1.2497 - regression_loss: 1.0950 - classification_loss: 0.1547 112/500 [=====>........................] - ETA: 2:12 - loss: 1.2488 - regression_loss: 1.0941 - classification_loss: 0.1547 113/500 [=====>........................] - ETA: 2:11 - loss: 1.2477 - regression_loss: 1.0927 - classification_loss: 0.1549 114/500 [=====>........................] - ETA: 2:11 - loss: 1.2504 - regression_loss: 1.0957 - classification_loss: 0.1547 115/500 [=====>........................] - ETA: 2:10 - loss: 1.2520 - regression_loss: 1.0971 - classification_loss: 0.1549 116/500 [=====>........................] - ETA: 2:10 - loss: 1.2482 - regression_loss: 1.0941 - classification_loss: 0.1542 117/500 [======>.......................] - ETA: 2:10 - loss: 1.2419 - regression_loss: 1.0887 - classification_loss: 0.1531 118/500 [======>.......................] - ETA: 2:09 - loss: 1.2413 - regression_loss: 1.0884 - classification_loss: 0.1529 119/500 [======>.......................] - ETA: 2:09 - loss: 1.2367 - regression_loss: 1.0844 - classification_loss: 0.1523 120/500 [======>.......................] - ETA: 2:09 - loss: 1.2324 - regression_loss: 1.0809 - classification_loss: 0.1515 121/500 [======>.......................] - ETA: 2:08 - loss: 1.2357 - regression_loss: 1.0838 - classification_loss: 0.1519 122/500 [======>.......................] - ETA: 2:08 - loss: 1.2407 - regression_loss: 1.0882 - classification_loss: 0.1525 123/500 [======>.......................] - ETA: 2:08 - loss: 1.2409 - regression_loss: 1.0882 - classification_loss: 0.1527 124/500 [======>.......................] - ETA: 2:07 - loss: 1.2415 - regression_loss: 1.0888 - classification_loss: 0.1527 125/500 [======>.......................] - ETA: 2:07 - loss: 1.2404 - regression_loss: 1.0880 - classification_loss: 0.1524 126/500 [======>.......................] 
- ETA: 2:07 - loss: 1.2370 - regression_loss: 1.0855 - classification_loss: 0.1515 127/500 [======>.......................] - ETA: 2:06 - loss: 1.2370 - regression_loss: 1.0858 - classification_loss: 0.1512 128/500 [======>.......................] - ETA: 2:06 - loss: 1.2414 - regression_loss: 1.0896 - classification_loss: 0.1518 129/500 [======>.......................] - ETA: 2:06 - loss: 1.2419 - regression_loss: 1.0899 - classification_loss: 0.1520 130/500 [======>.......................] - ETA: 2:05 - loss: 1.2382 - regression_loss: 1.0867 - classification_loss: 0.1515 131/500 [======>.......................] - ETA: 2:05 - loss: 1.2404 - regression_loss: 1.0888 - classification_loss: 0.1516 132/500 [======>.......................] - ETA: 2:05 - loss: 1.2440 - regression_loss: 1.0919 - classification_loss: 0.1521 133/500 [======>.......................] - ETA: 2:04 - loss: 1.2450 - regression_loss: 1.0928 - classification_loss: 0.1523 134/500 [=======>......................] - ETA: 2:04 - loss: 1.2404 - regression_loss: 1.0887 - classification_loss: 0.1517 135/500 [=======>......................] - ETA: 2:04 - loss: 1.2377 - regression_loss: 1.0861 - classification_loss: 0.1516 136/500 [=======>......................] - ETA: 2:03 - loss: 1.2409 - regression_loss: 1.0881 - classification_loss: 0.1528 137/500 [=======>......................] - ETA: 2:03 - loss: 1.2399 - regression_loss: 1.0877 - classification_loss: 0.1522 138/500 [=======>......................] - ETA: 2:03 - loss: 1.2386 - regression_loss: 1.0864 - classification_loss: 0.1523 139/500 [=======>......................] - ETA: 2:02 - loss: 1.2383 - regression_loss: 1.0858 - classification_loss: 0.1525 140/500 [=======>......................] - ETA: 2:02 - loss: 1.2375 - regression_loss: 1.0849 - classification_loss: 0.1526 141/500 [=======>......................] - ETA: 2:02 - loss: 1.2380 - regression_loss: 1.0853 - classification_loss: 0.1527 142/500 [=======>......................] 
- ETA: 2:01 - loss: 1.2384 - regression_loss: 1.0856 - classification_loss: 0.1528 143/500 [=======>......................] - ETA: 2:01 - loss: 1.2379 - regression_loss: 1.0853 - classification_loss: 0.1526 144/500 [=======>......................] - ETA: 2:01 - loss: 1.2353 - regression_loss: 1.0832 - classification_loss: 0.1521 145/500 [=======>......................] - ETA: 2:00 - loss: 1.2373 - regression_loss: 1.0855 - classification_loss: 0.1517 146/500 [=======>......................] - ETA: 2:00 - loss: 1.2316 - regression_loss: 1.0808 - classification_loss: 0.1508 147/500 [=======>......................] - ETA: 1:59 - loss: 1.2284 - regression_loss: 1.0780 - classification_loss: 0.1504 148/500 [=======>......................] - ETA: 1:59 - loss: 1.2296 - regression_loss: 1.0788 - classification_loss: 0.1508 149/500 [=======>......................] - ETA: 1:59 - loss: 1.2314 - regression_loss: 1.0804 - classification_loss: 0.1511 150/500 [========>.....................] - ETA: 1:58 - loss: 1.2365 - regression_loss: 1.0842 - classification_loss: 0.1523 151/500 [========>.....................] - ETA: 1:58 - loss: 1.2331 - regression_loss: 1.0814 - classification_loss: 0.1517 152/500 [========>.....................] - ETA: 1:58 - loss: 1.2282 - regression_loss: 1.0771 - classification_loss: 0.1511 153/500 [========>.....................] - ETA: 1:57 - loss: 1.2287 - regression_loss: 1.0776 - classification_loss: 0.1511 154/500 [========>.....................] - ETA: 1:57 - loss: 1.2324 - regression_loss: 1.0809 - classification_loss: 0.1515 155/500 [========>.....................] - ETA: 1:57 - loss: 1.2320 - regression_loss: 1.0803 - classification_loss: 0.1516 156/500 [========>.....................] - ETA: 1:56 - loss: 1.2304 - regression_loss: 1.0790 - classification_loss: 0.1514 157/500 [========>.....................] - ETA: 1:56 - loss: 1.2274 - regression_loss: 1.0765 - classification_loss: 0.1508 158/500 [========>.....................] 
- ETA: 1:56 - loss: 1.2297 - regression_loss: 1.0786 - classification_loss: 0.1511 159/500 [========>.....................] - ETA: 1:55 - loss: 1.2308 - regression_loss: 1.0797 - classification_loss: 0.1511 160/500 [========>.....................] - ETA: 1:55 - loss: 1.2293 - regression_loss: 1.0785 - classification_loss: 0.1508 161/500 [========>.....................] - ETA: 1:54 - loss: 1.2289 - regression_loss: 1.0784 - classification_loss: 0.1505 162/500 [========>.....................] - ETA: 1:54 - loss: 1.2287 - regression_loss: 1.0784 - classification_loss: 0.1503 163/500 [========>.....................] - ETA: 1:54 - loss: 1.2315 - regression_loss: 1.0809 - classification_loss: 0.1506 164/500 [========>.....................] - ETA: 1:54 - loss: 1.2317 - regression_loss: 1.0810 - classification_loss: 0.1507 165/500 [========>.....................] - ETA: 1:53 - loss: 1.2298 - regression_loss: 1.0794 - classification_loss: 0.1505 166/500 [========>.....................] - ETA: 1:53 - loss: 1.2262 - regression_loss: 1.0760 - classification_loss: 0.1502 167/500 [=========>....................] - ETA: 1:52 - loss: 1.2283 - regression_loss: 1.0777 - classification_loss: 0.1505 168/500 [=========>....................] - ETA: 1:52 - loss: 1.2285 - regression_loss: 1.0779 - classification_loss: 0.1506 169/500 [=========>....................] - ETA: 1:52 - loss: 1.2293 - regression_loss: 1.0788 - classification_loss: 0.1505 170/500 [=========>....................] - ETA: 1:51 - loss: 1.2293 - regression_loss: 1.0785 - classification_loss: 0.1508 171/500 [=========>....................] - ETA: 1:51 - loss: 1.2288 - regression_loss: 1.0782 - classification_loss: 0.1506 172/500 [=========>....................] - ETA: 1:51 - loss: 1.2319 - regression_loss: 1.0811 - classification_loss: 0.1507 173/500 [=========>....................] - ETA: 1:50 - loss: 1.2331 - regression_loss: 1.0823 - classification_loss: 0.1508 174/500 [=========>....................] 
- ETA: 1:50 - loss: 1.2323 - regression_loss: 1.0815 - classification_loss: 0.1508 175/500 [=========>....................] - ETA: 1:50 - loss: 1.2320 - regression_loss: 1.0811 - classification_loss: 0.1509 176/500 [=========>....................] - ETA: 1:49 - loss: 1.2328 - regression_loss: 1.0821 - classification_loss: 0.1508 177/500 [=========>....................] - ETA: 1:49 - loss: 1.2323 - regression_loss: 1.0817 - classification_loss: 0.1506 178/500 [=========>....................] - ETA: 1:49 - loss: 1.2335 - regression_loss: 1.0825 - classification_loss: 0.1509 179/500 [=========>....................] - ETA: 1:48 - loss: 1.2346 - regression_loss: 1.0837 - classification_loss: 0.1509 180/500 [=========>....................] - ETA: 1:48 - loss: 1.2329 - regression_loss: 1.0825 - classification_loss: 0.1505 181/500 [=========>....................] - ETA: 1:48 - loss: 1.2346 - regression_loss: 1.0840 - classification_loss: 0.1506 182/500 [=========>....................] - ETA: 1:47 - loss: 1.2307 - regression_loss: 1.0806 - classification_loss: 0.1501 183/500 [=========>....................] - ETA: 1:47 - loss: 1.2329 - regression_loss: 1.0825 - classification_loss: 0.1504 184/500 [==========>...................] - ETA: 1:47 - loss: 1.2338 - regression_loss: 1.0832 - classification_loss: 0.1506 185/500 [==========>...................] - ETA: 1:46 - loss: 1.2340 - regression_loss: 1.0836 - classification_loss: 0.1504 186/500 [==========>...................] - ETA: 1:46 - loss: 1.2362 - regression_loss: 1.0855 - classification_loss: 0.1507 187/500 [==========>...................] - ETA: 1:46 - loss: 1.2329 - regression_loss: 1.0827 - classification_loss: 0.1503 188/500 [==========>...................] - ETA: 1:45 - loss: 1.2315 - regression_loss: 1.0815 - classification_loss: 0.1499 189/500 [==========>...................] - ETA: 1:45 - loss: 1.2318 - regression_loss: 1.0819 - classification_loss: 0.1500 190/500 [==========>...................] 
- ETA: 1:45 - loss: 1.2332 - regression_loss: 1.0829 - classification_loss: 0.1503 191/500 [==========>...................] - ETA: 1:44 - loss: 1.2335 - regression_loss: 1.0832 - classification_loss: 0.1504 192/500 [==========>...................] - ETA: 1:44 - loss: 1.2360 - regression_loss: 1.0852 - classification_loss: 0.1508 193/500 [==========>...................] - ETA: 1:44 - loss: 1.2361 - regression_loss: 1.0854 - classification_loss: 0.1507 194/500 [==========>...................] - ETA: 1:43 - loss: 1.2365 - regression_loss: 1.0856 - classification_loss: 0.1509 195/500 [==========>...................] - ETA: 1:43 - loss: 1.2395 - regression_loss: 1.0884 - classification_loss: 0.1510 196/500 [==========>...................] - ETA: 1:43 - loss: 1.2389 - regression_loss: 1.0880 - classification_loss: 0.1509 197/500 [==========>...................] - ETA: 1:42 - loss: 1.2360 - regression_loss: 1.0856 - classification_loss: 0.1503 198/500 [==========>...................] - ETA: 1:42 - loss: 1.2366 - regression_loss: 1.0864 - classification_loss: 0.1503 199/500 [==========>...................] - ETA: 1:42 - loss: 1.2341 - regression_loss: 1.0842 - classification_loss: 0.1499 200/500 [===========>..................] - ETA: 1:41 - loss: 1.2351 - regression_loss: 1.0849 - classification_loss: 0.1503 201/500 [===========>..................] - ETA: 1:41 - loss: 1.2356 - regression_loss: 1.0852 - classification_loss: 0.1504 202/500 [===========>..................] - ETA: 1:41 - loss: 1.2349 - regression_loss: 1.0847 - classification_loss: 0.1502 203/500 [===========>..................] - ETA: 1:40 - loss: 1.2341 - regression_loss: 1.0840 - classification_loss: 0.1501 204/500 [===========>..................] - ETA: 1:40 - loss: 1.2327 - regression_loss: 1.0828 - classification_loss: 0.1499 205/500 [===========>..................] - ETA: 1:40 - loss: 1.2321 - regression_loss: 1.0822 - classification_loss: 0.1498 206/500 [===========>..................] 
- ETA: 1:39 - loss: 1.2302 - regression_loss: 1.0807 - classification_loss: 0.1494 207/500 [===========>..................] - ETA: 1:39 - loss: 1.2305 - regression_loss: 1.0808 - classification_loss: 0.1498 208/500 [===========>..................] - ETA: 1:39 - loss: 1.2325 - regression_loss: 1.0824 - classification_loss: 0.1501 209/500 [===========>..................] - ETA: 1:38 - loss: 1.2337 - regression_loss: 1.0835 - classification_loss: 0.1503 210/500 [===========>..................] - ETA: 1:38 - loss: 1.2324 - regression_loss: 1.0823 - classification_loss: 0.1501 211/500 [===========>..................] - ETA: 1:38 - loss: 1.2292 - regression_loss: 1.0795 - classification_loss: 0.1497 212/500 [===========>..................] - ETA: 1:37 - loss: 1.2296 - regression_loss: 1.0798 - classification_loss: 0.1498 213/500 [===========>..................] - ETA: 1:37 - loss: 1.2308 - regression_loss: 1.0808 - classification_loss: 0.1500 214/500 [===========>..................] - ETA: 1:37 - loss: 1.2269 - regression_loss: 1.0774 - classification_loss: 0.1495 215/500 [===========>..................] - ETA: 1:36 - loss: 1.2273 - regression_loss: 1.0777 - classification_loss: 0.1496 216/500 [===========>..................] - ETA: 1:36 - loss: 1.2261 - regression_loss: 1.0768 - classification_loss: 0.1494 217/500 [============>.................] - ETA: 1:36 - loss: 1.2263 - regression_loss: 1.0770 - classification_loss: 0.1493 218/500 [============>.................] - ETA: 1:35 - loss: 1.2238 - regression_loss: 1.0749 - classification_loss: 0.1488 219/500 [============>.................] - ETA: 1:35 - loss: 1.2249 - regression_loss: 1.0756 - classification_loss: 0.1493 220/500 [============>.................] - ETA: 1:34 - loss: 1.2240 - regression_loss: 1.0748 - classification_loss: 0.1492 221/500 [============>.................] - ETA: 1:34 - loss: 1.2206 - regression_loss: 1.0719 - classification_loss: 0.1487 222/500 [============>.................] 
- ETA: 1:34 - loss: 1.2182 - regression_loss: 1.0698 - classification_loss: 0.1484 223/500 [============>.................] - ETA: 1:33 - loss: 1.2185 - regression_loss: 1.0702 - classification_loss: 0.1483 224/500 [============>.................] - ETA: 1:33 - loss: 1.2200 - regression_loss: 1.0714 - classification_loss: 0.1485 225/500 [============>.................] - ETA: 1:33 - loss: 1.2205 - regression_loss: 1.0719 - classification_loss: 0.1487 226/500 [============>.................] - ETA: 1:32 - loss: 1.2211 - regression_loss: 1.0724 - classification_loss: 0.1487 227/500 [============>.................] - ETA: 1:32 - loss: 1.2221 - regression_loss: 1.0731 - classification_loss: 0.1490 228/500 [============>.................] - ETA: 1:32 - loss: 1.2223 - regression_loss: 1.0733 - classification_loss: 0.1490 229/500 [============>.................] - ETA: 1:31 - loss: 1.2237 - regression_loss: 1.0744 - classification_loss: 0.1493 230/500 [============>.................] - ETA: 1:31 - loss: 1.2253 - regression_loss: 1.0757 - classification_loss: 0.1497 231/500 [============>.................] - ETA: 1:31 - loss: 1.2241 - regression_loss: 1.0748 - classification_loss: 0.1493 232/500 [============>.................] - ETA: 1:30 - loss: 1.2229 - regression_loss: 1.0739 - classification_loss: 0.1490 233/500 [============>.................] - ETA: 1:30 - loss: 1.2240 - regression_loss: 1.0748 - classification_loss: 0.1492 234/500 [=============>................] - ETA: 1:30 - loss: 1.2225 - regression_loss: 1.0736 - classification_loss: 0.1489 235/500 [=============>................] - ETA: 1:29 - loss: 1.2236 - regression_loss: 1.0745 - classification_loss: 0.1491 236/500 [=============>................] - ETA: 1:29 - loss: 1.2256 - regression_loss: 1.0764 - classification_loss: 0.1492 237/500 [=============>................] - ETA: 1:29 - loss: 1.2247 - regression_loss: 1.0758 - classification_loss: 0.1490 238/500 [=============>................] 
- ETA: 1:28 - loss: 1.2237 - regression_loss: 1.0750 - classification_loss: 0.1487 239/500 [=============>................] - ETA: 1:28 - loss: 1.2254 - regression_loss: 1.0764 - classification_loss: 0.1490 240/500 [=============>................] - ETA: 1:28 - loss: 1.2260 - regression_loss: 1.0768 - classification_loss: 0.1492 241/500 [=============>................] - ETA: 1:27 - loss: 1.2298 - regression_loss: 1.0800 - classification_loss: 0.1498 242/500 [=============>................] - ETA: 1:27 - loss: 1.2296 - regression_loss: 1.0799 - classification_loss: 0.1498 243/500 [=============>................] - ETA: 1:27 - loss: 1.2298 - regression_loss: 1.0802 - classification_loss: 0.1496 244/500 [=============>................] - ETA: 1:26 - loss: 1.2308 - regression_loss: 1.0811 - classification_loss: 0.1497 245/500 [=============>................] - ETA: 1:26 - loss: 1.2314 - regression_loss: 1.0814 - classification_loss: 0.1500 246/500 [=============>................] - ETA: 1:26 - loss: 1.2308 - regression_loss: 1.0809 - classification_loss: 0.1499 247/500 [=============>................] - ETA: 1:25 - loss: 1.2298 - regression_loss: 1.0801 - classification_loss: 0.1498 248/500 [=============>................] - ETA: 1:25 - loss: 1.2282 - regression_loss: 1.0785 - classification_loss: 0.1497 249/500 [=============>................] - ETA: 1:25 - loss: 1.2274 - regression_loss: 1.0778 - classification_loss: 0.1497 250/500 [==============>...............] - ETA: 1:24 - loss: 1.2268 - regression_loss: 1.0771 - classification_loss: 0.1497 251/500 [==============>...............] - ETA: 1:24 - loss: 1.2278 - regression_loss: 1.0780 - classification_loss: 0.1498 252/500 [==============>...............] - ETA: 1:24 - loss: 1.2279 - regression_loss: 1.0780 - classification_loss: 0.1498 253/500 [==============>...............] - ETA: 1:23 - loss: 1.2296 - regression_loss: 1.0794 - classification_loss: 0.1501 254/500 [==============>...............] 
- ETA: 1:23 - loss: 1.2292 - regression_loss: 1.0794 - classification_loss: 0.1498 255/500 [==============>...............] - ETA: 1:23 - loss: 1.2284 - regression_loss: 1.0789 - classification_loss: 0.1495 256/500 [==============>...............] - ETA: 1:22 - loss: 1.2270 - regression_loss: 1.0775 - classification_loss: 0.1495 257/500 [==============>...............] - ETA: 1:22 - loss: 1.2278 - regression_loss: 1.0784 - classification_loss: 0.1494 258/500 [==============>...............] - ETA: 1:22 - loss: 1.2287 - regression_loss: 1.0787 - classification_loss: 0.1500 259/500 [==============>...............] - ETA: 1:21 - loss: 1.2261 - regression_loss: 1.0765 - classification_loss: 0.1496 260/500 [==============>...............] - ETA: 1:21 - loss: 1.2276 - regression_loss: 1.0777 - classification_loss: 0.1499 261/500 [==============>...............] - ETA: 1:21 - loss: 1.2275 - regression_loss: 1.0776 - classification_loss: 0.1499 262/500 [==============>...............] - ETA: 1:20 - loss: 1.2269 - regression_loss: 1.0772 - classification_loss: 0.1498 263/500 [==============>...............] - ETA: 1:20 - loss: 1.2314 - regression_loss: 1.0806 - classification_loss: 0.1508 264/500 [==============>...............] - ETA: 1:20 - loss: 1.2318 - regression_loss: 1.0809 - classification_loss: 0.1510 265/500 [==============>...............] - ETA: 1:19 - loss: 1.2317 - regression_loss: 1.0806 - classification_loss: 0.1511 266/500 [==============>...............] - ETA: 1:19 - loss: 1.2319 - regression_loss: 1.0808 - classification_loss: 0.1511 267/500 [===============>..............] - ETA: 1:19 - loss: 1.2324 - regression_loss: 1.0813 - classification_loss: 0.1511 268/500 [===============>..............] - ETA: 1:18 - loss: 1.2326 - regression_loss: 1.0814 - classification_loss: 0.1512 269/500 [===============>..............] - ETA: 1:18 - loss: 1.2312 - regression_loss: 1.0801 - classification_loss: 0.1511 270/500 [===============>..............] 
- ETA: 1:18 - loss: 1.2328 - regression_loss: 1.0815 - classification_loss: 0.1513 271/500 [===============>..............] - ETA: 1:17 - loss: 1.2336 - regression_loss: 1.0822 - classification_loss: 0.1514 272/500 [===============>..............] - ETA: 1:17 - loss: 1.2354 - regression_loss: 1.0838 - classification_loss: 0.1516 273/500 [===============>..............] - ETA: 1:17 - loss: 1.2332 - regression_loss: 1.0819 - classification_loss: 0.1513 274/500 [===============>..............] - ETA: 1:16 - loss: 1.2346 - regression_loss: 1.0828 - classification_loss: 0.1518 275/500 [===============>..............] - ETA: 1:16 - loss: 1.2357 - regression_loss: 1.0838 - classification_loss: 0.1519 276/500 [===============>..............] - ETA: 1:15 - loss: 1.2374 - regression_loss: 1.0851 - classification_loss: 0.1523 277/500 [===============>..............] - ETA: 1:15 - loss: 1.2375 - regression_loss: 1.0853 - classification_loss: 0.1523 278/500 [===============>..............] - ETA: 1:15 - loss: 1.2380 - regression_loss: 1.0859 - classification_loss: 0.1522 279/500 [===============>..............] - ETA: 1:14 - loss: 1.2397 - regression_loss: 1.0874 - classification_loss: 0.1523 280/500 [===============>..............] - ETA: 1:14 - loss: 1.2427 - regression_loss: 1.0899 - classification_loss: 0.1528 281/500 [===============>..............] - ETA: 1:14 - loss: 1.2445 - regression_loss: 1.0916 - classification_loss: 0.1529 282/500 [===============>..............] - ETA: 1:13 - loss: 1.2431 - regression_loss: 1.0904 - classification_loss: 0.1527 283/500 [===============>..............] - ETA: 1:13 - loss: 1.2469 - regression_loss: 1.0936 - classification_loss: 0.1533 284/500 [================>.............] - ETA: 1:13 - loss: 1.2452 - regression_loss: 1.0921 - classification_loss: 0.1531 285/500 [================>.............] - ETA: 1:12 - loss: 1.2450 - regression_loss: 1.0918 - classification_loss: 0.1532 286/500 [================>.............] 
- ETA: 1:12 - loss: 1.2451 - regression_loss: 1.0919 - classification_loss: 0.1531 287/500 [================>.............] - ETA: 1:12 - loss: 1.2467 - regression_loss: 1.0933 - classification_loss: 0.1534 288/500 [================>.............] - ETA: 1:11 - loss: 1.2477 - regression_loss: 1.0942 - classification_loss: 0.1535 289/500 [================>.............] - ETA: 1:11 - loss: 1.2458 - regression_loss: 1.0926 - classification_loss: 0.1532 290/500 [================>.............] - ETA: 1:11 - loss: 1.2440 - regression_loss: 1.0911 - classification_loss: 0.1528 291/500 [================>.............] - ETA: 1:10 - loss: 1.2431 - regression_loss: 1.0904 - classification_loss: 0.1527 292/500 [================>.............] - ETA: 1:10 - loss: 1.2437 - regression_loss: 1.0910 - classification_loss: 0.1528 293/500 [================>.............] - ETA: 1:10 - loss: 1.2443 - regression_loss: 1.0914 - classification_loss: 0.1529 294/500 [================>.............] - ETA: 1:09 - loss: 1.2459 - regression_loss: 1.0930 - classification_loss: 0.1529 295/500 [================>.............] - ETA: 1:09 - loss: 1.2467 - regression_loss: 1.0937 - classification_loss: 0.1530 296/500 [================>.............] - ETA: 1:09 - loss: 1.2470 - regression_loss: 1.0939 - classification_loss: 0.1532 297/500 [================>.............] - ETA: 1:08 - loss: 1.2483 - regression_loss: 1.0949 - classification_loss: 0.1534 298/500 [================>.............] - ETA: 1:08 - loss: 1.2475 - regression_loss: 1.0940 - classification_loss: 0.1534 299/500 [================>.............] - ETA: 1:08 - loss: 1.2487 - regression_loss: 1.0950 - classification_loss: 0.1537 300/500 [=================>............] - ETA: 1:07 - loss: 1.2522 - regression_loss: 1.0980 - classification_loss: 0.1542 301/500 [=================>............] - ETA: 1:07 - loss: 1.2522 - regression_loss: 1.0979 - classification_loss: 0.1543 302/500 [=================>............] 
- ETA: 1:07 - loss: 1.2513 - regression_loss: 1.0972 - classification_loss: 0.1541 303/500 [=================>............] - ETA: 1:06 - loss: 1.2513 - regression_loss: 1.0972 - classification_loss: 0.1541 304/500 [=================>............] - ETA: 1:06 - loss: 1.2508 - regression_loss: 1.0966 - classification_loss: 0.1542 305/500 [=================>............] - ETA: 1:06 - loss: 1.2509 - regression_loss: 1.0968 - classification_loss: 0.1541 306/500 [=================>............] - ETA: 1:05 - loss: 1.2519 - regression_loss: 1.0976 - classification_loss: 0.1542 307/500 [=================>............] - ETA: 1:05 - loss: 1.2504 - regression_loss: 1.0964 - classification_loss: 0.1540 308/500 [=================>............] - ETA: 1:05 - loss: 1.2499 - regression_loss: 1.0958 - classification_loss: 0.1540 309/500 [=================>............] - ETA: 1:04 - loss: 1.2479 - regression_loss: 1.0942 - classification_loss: 0.1537 310/500 [=================>............] - ETA: 1:04 - loss: 1.2462 - regression_loss: 1.0928 - classification_loss: 0.1534 311/500 [=================>............] - ETA: 1:04 - loss: 1.2468 - regression_loss: 1.0935 - classification_loss: 0.1533 312/500 [=================>............] - ETA: 1:03 - loss: 1.2487 - regression_loss: 1.0951 - classification_loss: 0.1536 313/500 [=================>............] - ETA: 1:03 - loss: 1.2469 - regression_loss: 1.0935 - classification_loss: 0.1534 314/500 [=================>............] - ETA: 1:03 - loss: 1.2449 - regression_loss: 1.0917 - classification_loss: 0.1532 315/500 [=================>............] - ETA: 1:02 - loss: 1.2457 - regression_loss: 1.0925 - classification_loss: 0.1531 316/500 [=================>............] - ETA: 1:02 - loss: 1.2447 - regression_loss: 1.0918 - classification_loss: 0.1529 317/500 [==================>...........] - ETA: 1:02 - loss: 1.2462 - regression_loss: 1.0931 - classification_loss: 0.1531 318/500 [==================>...........] 
- ETA: 1:01 - loss: 1.2445 - regression_loss: 1.0917 - classification_loss: 0.1528 319/500 [==================>...........] - ETA: 1:01 - loss: 1.2452 - regression_loss: 1.0923 - classification_loss: 0.1528 320/500 [==================>...........] - ETA: 1:01 - loss: 1.2439 - regression_loss: 1.0912 - classification_loss: 0.1527 321/500 [==================>...........] - ETA: 1:00 - loss: 1.2451 - regression_loss: 1.0921 - classification_loss: 0.1529 322/500 [==================>...........] - ETA: 1:00 - loss: 1.2457 - regression_loss: 1.0928 - classification_loss: 0.1529 323/500 [==================>...........] - ETA: 1:00 - loss: 1.2466 - regression_loss: 1.0934 - classification_loss: 0.1532 324/500 [==================>...........] - ETA: 59s - loss: 1.2494 - regression_loss: 1.0960 - classification_loss: 0.1534  325/500 [==================>...........] - ETA: 59s - loss: 1.2491 - regression_loss: 1.0957 - classification_loss: 0.1534 326/500 [==================>...........] - ETA: 59s - loss: 1.2484 - regression_loss: 1.0951 - classification_loss: 0.1533 327/500 [==================>...........] - ETA: 58s - loss: 1.2471 - regression_loss: 1.0941 - classification_loss: 0.1531 328/500 [==================>...........] - ETA: 58s - loss: 1.2474 - regression_loss: 1.0943 - classification_loss: 0.1531 329/500 [==================>...........] - ETA: 57s - loss: 1.2462 - regression_loss: 1.0934 - classification_loss: 0.1528 330/500 [==================>...........] - ETA: 57s - loss: 1.2448 - regression_loss: 1.0923 - classification_loss: 0.1525 331/500 [==================>...........] - ETA: 57s - loss: 1.2431 - regression_loss: 1.0908 - classification_loss: 0.1523 332/500 [==================>...........] - ETA: 56s - loss: 1.2434 - regression_loss: 1.0914 - classification_loss: 0.1521 333/500 [==================>...........] - ETA: 56s - loss: 1.2434 - regression_loss: 1.0913 - classification_loss: 0.1521 334/500 [===================>..........] 
- ETA: 56s - loss: 1.2442 - regression_loss: 1.0921 - classification_loss: 0.1521 335/500 [===================>..........] - ETA: 55s - loss: 1.2448 - regression_loss: 1.0926 - classification_loss: 0.1521 336/500 [===================>..........] - ETA: 55s - loss: 1.2443 - regression_loss: 1.0924 - classification_loss: 0.1519 337/500 [===================>..........] - ETA: 55s - loss: 1.2456 - regression_loss: 1.0935 - classification_loss: 0.1521 338/500 [===================>..........] - ETA: 54s - loss: 1.2460 - regression_loss: 1.0941 - classification_loss: 0.1519 339/500 [===================>..........] - ETA: 54s - loss: 1.2462 - regression_loss: 1.0942 - classification_loss: 0.1520 340/500 [===================>..........] - ETA: 54s - loss: 1.2469 - regression_loss: 1.0947 - classification_loss: 0.1522 341/500 [===================>..........] - ETA: 53s - loss: 1.2474 - regression_loss: 1.0952 - classification_loss: 0.1522 342/500 [===================>..........] - ETA: 53s - loss: 1.2502 - regression_loss: 1.0980 - classification_loss: 0.1523 343/500 [===================>..........] - ETA: 53s - loss: 1.2482 - regression_loss: 1.0962 - classification_loss: 0.1520 344/500 [===================>..........] - ETA: 52s - loss: 1.2480 - regression_loss: 1.0960 - classification_loss: 0.1520 345/500 [===================>..........] - ETA: 52s - loss: 1.2486 - regression_loss: 1.0966 - classification_loss: 0.1520 346/500 [===================>..........] - ETA: 52s - loss: 1.2490 - regression_loss: 1.0970 - classification_loss: 0.1520 347/500 [===================>..........] - ETA: 51s - loss: 1.2491 - regression_loss: 1.0971 - classification_loss: 0.1520 348/500 [===================>..........] - ETA: 51s - loss: 1.2472 - regression_loss: 1.0954 - classification_loss: 0.1518 349/500 [===================>..........] - ETA: 51s - loss: 1.2468 - regression_loss: 1.0949 - classification_loss: 0.1519 350/500 [====================>.........] 
- ETA: 50s - loss: 1.2445 - regression_loss: 1.0929 - classification_loss: 0.1516 351/500 [====================>.........] - ETA: 50s - loss: 1.2423 - regression_loss: 1.0911 - classification_loss: 0.1512 352/500 [====================>.........] - ETA: 50s - loss: 1.2431 - regression_loss: 1.0918 - classification_loss: 0.1513 353/500 [====================>.........] - ETA: 49s - loss: 1.2427 - regression_loss: 1.0915 - classification_loss: 0.1512 354/500 [====================>.........] - ETA: 49s - loss: 1.2423 - regression_loss: 1.0913 - classification_loss: 0.1510 355/500 [====================>.........] - ETA: 49s - loss: 1.2411 - regression_loss: 1.0902 - classification_loss: 0.1509 356/500 [====================>.........] - ETA: 48s - loss: 1.2420 - regression_loss: 1.0910 - classification_loss: 0.1510 357/500 [====================>.........] - ETA: 48s - loss: 1.2428 - regression_loss: 1.0919 - classification_loss: 0.1509 358/500 [====================>.........] - ETA: 48s - loss: 1.2409 - regression_loss: 1.0902 - classification_loss: 0.1507 359/500 [====================>.........] - ETA: 47s - loss: 1.2411 - regression_loss: 1.0903 - classification_loss: 0.1508 360/500 [====================>.........] - ETA: 47s - loss: 1.2395 - regression_loss: 1.0891 - classification_loss: 0.1505 361/500 [====================>.........] - ETA: 47s - loss: 1.2394 - regression_loss: 1.0886 - classification_loss: 0.1508 362/500 [====================>.........] - ETA: 46s - loss: 1.2398 - regression_loss: 1.0888 - classification_loss: 0.1510 363/500 [====================>.........] - ETA: 46s - loss: 1.2406 - regression_loss: 1.0894 - classification_loss: 0.1512 364/500 [====================>.........] - ETA: 46s - loss: 1.2405 - regression_loss: 1.0894 - classification_loss: 0.1511 365/500 [====================>.........] - ETA: 45s - loss: 1.2398 - regression_loss: 1.0890 - classification_loss: 0.1509 366/500 [====================>.........] 
- ETA: 45s - loss: 1.2406 - regression_loss: 1.0897 - classification_loss: 0.1509 367/500 [=====================>........] - ETA: 45s - loss: 1.2388 - regression_loss: 1.0881 - classification_loss: 0.1507 368/500 [=====================>........] - ETA: 44s - loss: 1.2391 - regression_loss: 1.0884 - classification_loss: 0.1507 369/500 [=====================>........] - ETA: 44s - loss: 1.2374 - regression_loss: 1.0868 - classification_loss: 0.1505 370/500 [=====================>........] - ETA: 44s - loss: 1.2377 - regression_loss: 1.0871 - classification_loss: 0.1505 371/500 [=====================>........] - ETA: 43s - loss: 1.2371 - regression_loss: 1.0867 - classification_loss: 0.1504 372/500 [=====================>........] - ETA: 43s - loss: 1.2364 - regression_loss: 1.0861 - classification_loss: 0.1503 373/500 [=====================>........] - ETA: 43s - loss: 1.2354 - regression_loss: 1.0853 - classification_loss: 0.1500 374/500 [=====================>........] - ETA: 42s - loss: 1.2357 - regression_loss: 1.0857 - classification_loss: 0.1500 375/500 [=====================>........] - ETA: 42s - loss: 1.2357 - regression_loss: 1.0856 - classification_loss: 0.1500 376/500 [=====================>........] - ETA: 42s - loss: 1.2363 - regression_loss: 1.0862 - classification_loss: 0.1501 377/500 [=====================>........] - ETA: 41s - loss: 1.2358 - regression_loss: 1.0857 - classification_loss: 0.1501 378/500 [=====================>........] - ETA: 41s - loss: 1.2363 - regression_loss: 1.0862 - classification_loss: 0.1501 379/500 [=====================>........] - ETA: 41s - loss: 1.2370 - regression_loss: 1.0869 - classification_loss: 0.1501 380/500 [=====================>........] - ETA: 40s - loss: 1.2369 - regression_loss: 1.0867 - classification_loss: 0.1502 381/500 [=====================>........] - ETA: 40s - loss: 1.2365 - regression_loss: 1.0864 - classification_loss: 0.1501 382/500 [=====================>........] 
- ETA: 39s - loss: 1.2368 - regression_loss: 1.0866 - classification_loss: 0.1502 383/500 [=====================>........] - ETA: 39s - loss: 1.2366 - regression_loss: 1.0865 - classification_loss: 0.1502 384/500 [======================>.......] - ETA: 39s - loss: 1.2373 - regression_loss: 1.0870 - classification_loss: 0.1503 385/500 [======================>.......] - ETA: 38s - loss: 1.2390 - regression_loss: 1.0884 - classification_loss: 0.1506 386/500 [======================>.......] - ETA: 38s - loss: 1.2382 - regression_loss: 1.0877 - classification_loss: 0.1505 387/500 [======================>.......] - ETA: 38s - loss: 1.2385 - regression_loss: 1.0879 - classification_loss: 0.1506 388/500 [======================>.......] - ETA: 37s - loss: 1.2385 - regression_loss: 1.0881 - classification_loss: 0.1504 389/500 [======================>.......] - ETA: 37s - loss: 1.2399 - regression_loss: 1.0888 - classification_loss: 0.1511 390/500 [======================>.......] - ETA: 37s - loss: 1.2393 - regression_loss: 1.0884 - classification_loss: 0.1509 391/500 [======================>.......] - ETA: 36s - loss: 1.2401 - regression_loss: 1.0892 - classification_loss: 0.1509 392/500 [======================>.......] - ETA: 36s - loss: 1.2391 - regression_loss: 1.0884 - classification_loss: 0.1507 393/500 [======================>.......] - ETA: 36s - loss: 1.2394 - regression_loss: 1.0886 - classification_loss: 0.1508 394/500 [======================>.......] - ETA: 35s - loss: 1.2392 - regression_loss: 1.0885 - classification_loss: 0.1507 395/500 [======================>.......] - ETA: 35s - loss: 1.2388 - regression_loss: 1.0881 - classification_loss: 0.1507 396/500 [======================>.......] - ETA: 35s - loss: 1.2401 - regression_loss: 1.0892 - classification_loss: 0.1509 397/500 [======================>.......] - ETA: 34s - loss: 1.2408 - regression_loss: 1.0899 - classification_loss: 0.1510 398/500 [======================>.......] 
- ETA: 34s - loss: 1.2423 - regression_loss: 1.0913 - classification_loss: 0.1511 399/500 [======================>.......] - ETA: 34s - loss: 1.2435 - regression_loss: 1.0922 - classification_loss: 0.1513 400/500 [=======================>......] - ETA: 33s - loss: 1.2443 - regression_loss: 1.0929 - classification_loss: 0.1514 401/500 [=======================>......] - ETA: 33s - loss: 1.2447 - regression_loss: 1.0931 - classification_loss: 0.1516 402/500 [=======================>......] - ETA: 33s - loss: 1.2448 - regression_loss: 1.0931 - classification_loss: 0.1516 403/500 [=======================>......] - ETA: 32s - loss: 1.2431 - regression_loss: 1.0918 - classification_loss: 0.1514 404/500 [=======================>......] - ETA: 32s - loss: 1.2433 - regression_loss: 1.0916 - classification_loss: 0.1517 405/500 [=======================>......] - ETA: 32s - loss: 1.2442 - regression_loss: 1.0924 - classification_loss: 0.1519 406/500 [=======================>......] - ETA: 31s - loss: 1.2428 - regression_loss: 1.0911 - classification_loss: 0.1517 407/500 [=======================>......] - ETA: 31s - loss: 1.2418 - regression_loss: 1.0902 - classification_loss: 0.1515 408/500 [=======================>......] - ETA: 31s - loss: 1.2411 - regression_loss: 1.0898 - classification_loss: 0.1513 409/500 [=======================>......] - ETA: 30s - loss: 1.2394 - regression_loss: 1.0884 - classification_loss: 0.1511 410/500 [=======================>......] - ETA: 30s - loss: 1.2398 - regression_loss: 1.0888 - classification_loss: 0.1510 411/500 [=======================>......] - ETA: 30s - loss: 1.2413 - regression_loss: 1.0899 - classification_loss: 0.1514 412/500 [=======================>......] - ETA: 29s - loss: 1.2415 - regression_loss: 1.0901 - classification_loss: 0.1513 413/500 [=======================>......] - ETA: 29s - loss: 1.2411 - regression_loss: 1.0898 - classification_loss: 0.1513 414/500 [=======================>......] 
- ETA: 29s - loss: 1.2403 - regression_loss: 1.0892 - classification_loss: 0.1511 415/500 [=======================>......] - ETA: 28s - loss: 1.2401 - regression_loss: 1.0889 - classification_loss: 0.1512 416/500 [=======================>......] - ETA: 28s - loss: 1.2400 - regression_loss: 1.0889 - classification_loss: 0.1511 417/500 [========================>.....] - ETA: 28s - loss: 1.2410 - regression_loss: 1.0896 - classification_loss: 0.1514 418/500 [========================>.....] - ETA: 27s - loss: 1.2416 - regression_loss: 1.0901 - classification_loss: 0.1516 419/500 [========================>.....] - ETA: 27s - loss: 1.2423 - regression_loss: 1.0906 - classification_loss: 0.1517 420/500 [========================>.....] - ETA: 27s - loss: 1.2431 - regression_loss: 1.0912 - classification_loss: 0.1519 421/500 [========================>.....] - ETA: 26s - loss: 1.2444 - regression_loss: 1.0923 - classification_loss: 0.1521 422/500 [========================>.....] - ETA: 26s - loss: 1.2437 - regression_loss: 1.0917 - classification_loss: 0.1519 423/500 [========================>.....] - ETA: 26s - loss: 1.2437 - regression_loss: 1.0918 - classification_loss: 0.1519 424/500 [========================>.....] - ETA: 25s - loss: 1.2416 - regression_loss: 1.0898 - classification_loss: 0.1518 425/500 [========================>.....] - ETA: 25s - loss: 1.2403 - regression_loss: 1.0888 - classification_loss: 0.1515 426/500 [========================>.....] - ETA: 25s - loss: 1.2395 - regression_loss: 1.0882 - classification_loss: 0.1513 427/500 [========================>.....] - ETA: 24s - loss: 1.2388 - regression_loss: 1.0876 - classification_loss: 0.1512 428/500 [========================>.....] - ETA: 24s - loss: 1.2390 - regression_loss: 1.0877 - classification_loss: 0.1513 429/500 [========================>.....] - ETA: 24s - loss: 1.2400 - regression_loss: 1.0885 - classification_loss: 0.1514 430/500 [========================>.....] 
- ETA: 23s - loss: 1.2398 - regression_loss: 1.0884 - classification_loss: 0.1514 431/500 [========================>.....] - ETA: 23s - loss: 1.2405 - regression_loss: 1.0890 - classification_loss: 0.1515 432/500 [========================>.....] - ETA: 23s - loss: 1.2416 - regression_loss: 1.0900 - classification_loss: 0.1516 433/500 [========================>.....] - ETA: 22s - loss: 1.2419 - regression_loss: 1.0902 - classification_loss: 0.1517 434/500 [=========================>....] - ETA: 22s - loss: 1.2412 - regression_loss: 1.0897 - classification_loss: 0.1515 435/500 [=========================>....] - ETA: 22s - loss: 1.2414 - regression_loss: 1.0898 - classification_loss: 0.1516 436/500 [=========================>....] - ETA: 21s - loss: 1.2406 - regression_loss: 1.0892 - classification_loss: 0.1514 437/500 [=========================>....] - ETA: 21s - loss: 1.2414 - regression_loss: 1.0899 - classification_loss: 0.1516 438/500 [=========================>....] - ETA: 21s - loss: 1.2407 - regression_loss: 1.0893 - classification_loss: 0.1515 439/500 [=========================>....] - ETA: 20s - loss: 1.2404 - regression_loss: 1.0889 - classification_loss: 0.1515 440/500 [=========================>....] - ETA: 20s - loss: 1.2421 - regression_loss: 1.0902 - classification_loss: 0.1519 441/500 [=========================>....] - ETA: 20s - loss: 1.2418 - regression_loss: 1.0900 - classification_loss: 0.1517 442/500 [=========================>....] - ETA: 19s - loss: 1.2410 - regression_loss: 1.0893 - classification_loss: 0.1517 443/500 [=========================>....] - ETA: 19s - loss: 1.2407 - regression_loss: 1.0890 - classification_loss: 0.1517 444/500 [=========================>....] - ETA: 19s - loss: 1.2395 - regression_loss: 1.0880 - classification_loss: 0.1515 445/500 [=========================>....] - ETA: 18s - loss: 1.2399 - regression_loss: 1.0884 - classification_loss: 0.1516 446/500 [=========================>....] 
- ETA: 18s - loss: 1.2400 - regression_loss: 1.0885 - classification_loss: 0.1515 447/500 [=========================>....] - ETA: 17s - loss: 1.2396 - regression_loss: 1.0882 - classification_loss: 0.1515 448/500 [=========================>....] - ETA: 17s - loss: 1.2398 - regression_loss: 1.0882 - classification_loss: 0.1516 449/500 [=========================>....] - ETA: 17s - loss: 1.2393 - regression_loss: 1.0878 - classification_loss: 0.1515 450/500 [==========================>...] - ETA: 16s - loss: 1.2398 - regression_loss: 1.0882 - classification_loss: 0.1516 451/500 [==========================>...] - ETA: 16s - loss: 1.2392 - regression_loss: 1.0877 - classification_loss: 0.1515 452/500 [==========================>...] - ETA: 16s - loss: 1.2376 - regression_loss: 1.0863 - classification_loss: 0.1512 453/500 [==========================>...] - ETA: 15s - loss: 1.2388 - regression_loss: 1.0873 - classification_loss: 0.1515 454/500 [==========================>...] - ETA: 15s - loss: 1.2382 - regression_loss: 1.0867 - classification_loss: 0.1515 455/500 [==========================>...] - ETA: 15s - loss: 1.2393 - regression_loss: 1.0876 - classification_loss: 0.1517 456/500 [==========================>...] - ETA: 14s - loss: 1.2393 - regression_loss: 1.0876 - classification_loss: 0.1517 457/500 [==========================>...] - ETA: 14s - loss: 1.2394 - regression_loss: 1.0877 - classification_loss: 0.1518 458/500 [==========================>...] - ETA: 14s - loss: 1.2399 - regression_loss: 1.0881 - classification_loss: 0.1518 459/500 [==========================>...] - ETA: 13s - loss: 1.2395 - regression_loss: 1.0878 - classification_loss: 0.1517 460/500 [==========================>...] - ETA: 13s - loss: 1.2406 - regression_loss: 1.0884 - classification_loss: 0.1522 461/500 [==========================>...] - ETA: 13s - loss: 1.2397 - regression_loss: 1.0876 - classification_loss: 0.1520 462/500 [==========================>...] 
- ETA: 12s - loss: 1.2408 - regression_loss: 1.0886 - classification_loss: 0.1522 463/500 [==========================>...] - ETA: 12s - loss: 1.2419 - regression_loss: 1.0895 - classification_loss: 0.1524 464/500 [==========================>...] - ETA: 12s - loss: 1.2420 - regression_loss: 1.0896 - classification_loss: 0.1524 465/500 [==========================>...] - ETA: 11s - loss: 1.2418 - regression_loss: 1.0894 - classification_loss: 0.1524 466/500 [==========================>...] - ETA: 11s - loss: 1.2410 - regression_loss: 1.0887 - classification_loss: 0.1523 467/500 [===========================>..] - ETA: 11s - loss: 1.2416 - regression_loss: 1.0892 - classification_loss: 0.1524 468/500 [===========================>..] - ETA: 10s - loss: 1.2408 - regression_loss: 1.0886 - classification_loss: 0.1522 469/500 [===========================>..] - ETA: 10s - loss: 1.2402 - regression_loss: 1.0881 - classification_loss: 0.1521 470/500 [===========================>..] - ETA: 10s - loss: 1.2403 - regression_loss: 1.0882 - classification_loss: 0.1521 471/500 [===========================>..] - ETA: 9s - loss: 1.2407 - regression_loss: 1.0885 - classification_loss: 0.1522  472/500 [===========================>..] - ETA: 9s - loss: 1.2400 - regression_loss: 1.0878 - classification_loss: 0.1522 473/500 [===========================>..] - ETA: 9s - loss: 1.2397 - regression_loss: 1.0876 - classification_loss: 0.1521 474/500 [===========================>..] - ETA: 8s - loss: 1.2388 - regression_loss: 1.0869 - classification_loss: 0.1519 475/500 [===========================>..] - ETA: 8s - loss: 1.2398 - regression_loss: 1.0878 - classification_loss: 0.1520 476/500 [===========================>..] - ETA: 8s - loss: 1.2386 - regression_loss: 1.0868 - classification_loss: 0.1519 477/500 [===========================>..] - ETA: 7s - loss: 1.2389 - regression_loss: 1.0870 - classification_loss: 0.1519 478/500 [===========================>..] 
- ETA: 7s - loss: 1.2395 - regression_loss: 1.0876 - classification_loss: 0.1520 479/500 [===========================>..] - ETA: 7s - loss: 1.2385 - regression_loss: 1.0867 - classification_loss: 0.1518 480/500 [===========================>..] - ETA: 6s - loss: 1.2425 - regression_loss: 1.0901 - classification_loss: 0.1524 481/500 [===========================>..] - ETA: 6s - loss: 1.2420 - regression_loss: 1.0897 - classification_loss: 0.1523 482/500 [===========================>..] - ETA: 6s - loss: 1.2420 - regression_loss: 1.0897 - classification_loss: 0.1522 483/500 [===========================>..] - ETA: 5s - loss: 1.2417 - regression_loss: 1.0896 - classification_loss: 0.1521 484/500 [============================>.] - ETA: 5s - loss: 1.2415 - regression_loss: 1.0894 - classification_loss: 0.1521 485/500 [============================>.] - ETA: 5s - loss: 1.2420 - regression_loss: 1.0899 - classification_loss: 0.1521 486/500 [============================>.] - ETA: 4s - loss: 1.2418 - regression_loss: 1.0898 - classification_loss: 0.1520 487/500 [============================>.] - ETA: 4s - loss: 1.2434 - regression_loss: 1.0911 - classification_loss: 0.1523 488/500 [============================>.] - ETA: 4s - loss: 1.2439 - regression_loss: 1.0916 - classification_loss: 0.1523 489/500 [============================>.] - ETA: 3s - loss: 1.2442 - regression_loss: 1.0920 - classification_loss: 0.1522 490/500 [============================>.] - ETA: 3s - loss: 1.2444 - regression_loss: 1.0921 - classification_loss: 0.1523 491/500 [============================>.] - ETA: 3s - loss: 1.2445 - regression_loss: 1.0922 - classification_loss: 0.1523 492/500 [============================>.] - ETA: 2s - loss: 1.2443 - regression_loss: 1.0921 - classification_loss: 0.1522 493/500 [============================>.] - ETA: 2s - loss: 1.2447 - regression_loss: 1.0923 - classification_loss: 0.1523 494/500 [============================>.] 
- ETA: 2s - loss: 1.2448 - regression_loss: 1.0923 - classification_loss: 0.1525 495/500 [============================>.] - ETA: 1s - loss: 1.2450 - regression_loss: 1.0925 - classification_loss: 0.1526 496/500 [============================>.] - ETA: 1s - loss: 1.2466 - regression_loss: 1.0938 - classification_loss: 0.1528 497/500 [============================>.] - ETA: 1s - loss: 1.2469 - regression_loss: 1.0941 - classification_loss: 0.1527 498/500 [============================>.] - ETA: 0s - loss: 1.2470 - regression_loss: 1.0942 - classification_loss: 0.1528 499/500 [============================>.] - ETA: 0s - loss: 1.2471 - regression_loss: 1.0943 - classification_loss: 0.1528 500/500 [==============================] - 170s 339ms/step - loss: 1.2490 - regression_loss: 1.0956 - classification_loss: 0.1534 1172 instances of class plum with average precision: 0.7582 mAP: 0.7582 Epoch 00019: saving model to ./training/snapshots/resnet101_pascal_19.h5 Epoch 20/150 1/500 [..............................] - ETA: 2:55 - loss: 1.4182 - regression_loss: 1.2794 - classification_loss: 0.1387 2/500 [..............................] - ETA: 2:54 - loss: 1.6120 - regression_loss: 1.4440 - classification_loss: 0.1680 3/500 [..............................] - ETA: 2:48 - loss: 1.5238 - regression_loss: 1.3616 - classification_loss: 0.1622 4/500 [..............................] - ETA: 2:44 - loss: 1.2999 - regression_loss: 1.1585 - classification_loss: 0.1414 5/500 [..............................] - ETA: 2:42 - loss: 1.2162 - regression_loss: 1.0898 - classification_loss: 0.1264 6/500 [..............................] - ETA: 2:40 - loss: 1.2930 - regression_loss: 1.1555 - classification_loss: 0.1375 7/500 [..............................] - ETA: 2:39 - loss: 1.3032 - regression_loss: 1.1628 - classification_loss: 0.1404 8/500 [..............................] - ETA: 2:40 - loss: 1.1842 - regression_loss: 1.0579 - classification_loss: 0.1262 9/500 [..............................] 
- ETA: 2:41 - loss: 1.1066 - regression_loss: 0.9890 - classification_loss: 0.1176 10/500 [..............................] - ETA: 2:40 - loss: 1.1387 - regression_loss: 1.0126 - classification_loss: 0.1261 11/500 [..............................] - ETA: 2:40 - loss: 1.1970 - regression_loss: 1.0676 - classification_loss: 0.1294 12/500 [..............................] - ETA: 2:40 - loss: 1.1825 - regression_loss: 1.0527 - classification_loss: 0.1298 13/500 [..............................] - ETA: 2:40 - loss: 1.1792 - regression_loss: 1.0520 - classification_loss: 0.1271 14/500 [..............................] - ETA: 2:40 - loss: 1.1653 - regression_loss: 1.0393 - classification_loss: 0.1260 15/500 [..............................] - ETA: 2:40 - loss: 1.1764 - regression_loss: 1.0442 - classification_loss: 0.1323 16/500 [..............................] - ETA: 2:40 - loss: 1.2034 - regression_loss: 1.0670 - classification_loss: 0.1364 17/500 [>.............................] - ETA: 2:40 - loss: 1.2238 - regression_loss: 1.0868 - classification_loss: 0.1370 18/500 [>.............................] - ETA: 2:40 - loss: 1.1873 - regression_loss: 1.0549 - classification_loss: 0.1324 19/500 [>.............................] - ETA: 2:40 - loss: 1.1678 - regression_loss: 1.0379 - classification_loss: 0.1299 20/500 [>.............................] - ETA: 2:39 - loss: 1.1434 - regression_loss: 1.0155 - classification_loss: 0.1279 21/500 [>.............................] - ETA: 2:39 - loss: 1.1509 - regression_loss: 1.0163 - classification_loss: 0.1346 22/500 [>.............................] - ETA: 2:39 - loss: 1.1641 - regression_loss: 1.0259 - classification_loss: 0.1382 23/500 [>.............................] - ETA: 2:39 - loss: 1.1775 - regression_loss: 1.0367 - classification_loss: 0.1408 24/500 [>.............................] - ETA: 2:39 - loss: 1.1614 - regression_loss: 1.0240 - classification_loss: 0.1374 25/500 [>.............................] 
- ETA: 2:39 - loss: 1.1576 - regression_loss: 1.0205 - classification_loss: 0.1370 26/500 [>.............................] - ETA: 2:39 - loss: 1.1450 - regression_loss: 1.0096 - classification_loss: 0.1354 27/500 [>.............................] - ETA: 2:39 - loss: 1.1375 - regression_loss: 1.0031 - classification_loss: 0.1344 28/500 [>.............................] - ETA: 2:39 - loss: 1.1197 - regression_loss: 0.9885 - classification_loss: 0.1312 29/500 [>.............................] - ETA: 2:38 - loss: 1.1143 - regression_loss: 0.9841 - classification_loss: 0.1302 30/500 [>.............................] - ETA: 2:38 - loss: 1.1169 - regression_loss: 0.9881 - classification_loss: 0.1288 31/500 [>.............................] - ETA: 2:38 - loss: 1.1068 - regression_loss: 0.9797 - classification_loss: 0.1270 32/500 [>.............................] - ETA: 2:38 - loss: 1.1143 - regression_loss: 0.9871 - classification_loss: 0.1272 33/500 [>.............................] - ETA: 2:37 - loss: 1.0948 - regression_loss: 0.9703 - classification_loss: 0.1245 34/500 [=>............................] - ETA: 2:37 - loss: 1.1335 - regression_loss: 0.9846 - classification_loss: 0.1489 35/500 [=>............................] - ETA: 2:37 - loss: 1.1312 - regression_loss: 0.9837 - classification_loss: 0.1476 36/500 [=>............................] - ETA: 2:36 - loss: 1.1108 - regression_loss: 0.9657 - classification_loss: 0.1451 37/500 [=>............................] - ETA: 2:36 - loss: 1.1204 - regression_loss: 0.9743 - classification_loss: 0.1461 38/500 [=>............................] - ETA: 2:36 - loss: 1.1324 - regression_loss: 0.9846 - classification_loss: 0.1478 39/500 [=>............................] - ETA: 2:35 - loss: 1.1380 - regression_loss: 0.9897 - classification_loss: 0.1484 40/500 [=>............................] - ETA: 2:35 - loss: 1.1373 - regression_loss: 0.9891 - classification_loss: 0.1482 41/500 [=>............................] 
- ETA: 2:35 - loss: 1.1361 - regression_loss: 0.9888 - classification_loss: 0.1472 42/500 [=>............................] - ETA: 2:34 - loss: 1.1271 - regression_loss: 0.9817 - classification_loss: 0.1454 43/500 [=>............................] - ETA: 2:34 - loss: 1.1351 - regression_loss: 0.9888 - classification_loss: 0.1464 44/500 [=>............................] - ETA: 2:33 - loss: 1.1480 - regression_loss: 0.9997 - classification_loss: 0.1483 45/500 [=>............................] - ETA: 2:33 - loss: 1.1362 - regression_loss: 0.9892 - classification_loss: 0.1470 46/500 [=>............................] - ETA: 2:33 - loss: 1.1505 - regression_loss: 1.0031 - classification_loss: 0.1473 47/500 [=>............................] - ETA: 2:32 - loss: 1.1568 - regression_loss: 1.0091 - classification_loss: 0.1477 48/500 [=>............................] - ETA: 2:32 - loss: 1.1547 - regression_loss: 1.0082 - classification_loss: 0.1465 49/500 [=>............................] - ETA: 2:32 - loss: 1.1509 - regression_loss: 1.0057 - classification_loss: 0.1452 50/500 [==>...........................] - ETA: 2:32 - loss: 1.1483 - regression_loss: 1.0042 - classification_loss: 0.1441 51/500 [==>...........................] - ETA: 2:31 - loss: 1.1419 - regression_loss: 0.9988 - classification_loss: 0.1431 52/500 [==>...........................] - ETA: 2:31 - loss: 1.1447 - regression_loss: 1.0012 - classification_loss: 0.1435 53/500 [==>...........................] - ETA: 2:31 - loss: 1.1394 - regression_loss: 0.9969 - classification_loss: 0.1425 54/500 [==>...........................] - ETA: 2:31 - loss: 1.1397 - regression_loss: 0.9977 - classification_loss: 0.1420 55/500 [==>...........................] - ETA: 2:30 - loss: 1.1531 - regression_loss: 1.0085 - classification_loss: 0.1446 56/500 [==>...........................] - ETA: 2:30 - loss: 1.1464 - regression_loss: 1.0024 - classification_loss: 0.1440 57/500 [==>...........................] 
- ETA: 2:29 - loss: 1.1446 - regression_loss: 1.0006 - classification_loss: 0.1440 58/500 [==>...........................] - ETA: 2:29 - loss: 1.1486 - regression_loss: 1.0042 - classification_loss: 0.1444 59/500 [==>...........................] - ETA: 2:29 - loss: 1.1507 - regression_loss: 1.0041 - classification_loss: 0.1466 60/500 [==>...........................] - ETA: 2:28 - loss: 1.1540 - regression_loss: 1.0074 - classification_loss: 0.1467 61/500 [==>...........................] - ETA: 2:28 - loss: 1.1587 - regression_loss: 1.0111 - classification_loss: 0.1475 62/500 [==>...........................] - ETA: 2:28 - loss: 1.1544 - regression_loss: 1.0084 - classification_loss: 0.1460 63/500 [==>...........................] - ETA: 2:28 - loss: 1.1523 - regression_loss: 1.0064 - classification_loss: 0.1459 64/500 [==>...........................] - ETA: 2:27 - loss: 1.1570 - regression_loss: 1.0113 - classification_loss: 0.1457 65/500 [==>...........................] - ETA: 2:27 - loss: 1.1584 - regression_loss: 1.0130 - classification_loss: 0.1455 66/500 [==>...........................] - ETA: 2:27 - loss: 1.1632 - regression_loss: 1.0164 - classification_loss: 0.1467 67/500 [===>..........................] - ETA: 2:26 - loss: 1.1573 - regression_loss: 1.0114 - classification_loss: 0.1459 68/500 [===>..........................] - ETA: 2:26 - loss: 1.1553 - regression_loss: 1.0097 - classification_loss: 0.1456 69/500 [===>..........................] - ETA: 2:26 - loss: 1.1483 - regression_loss: 1.0037 - classification_loss: 0.1446 70/500 [===>..........................] - ETA: 2:26 - loss: 1.1546 - regression_loss: 1.0088 - classification_loss: 0.1458 71/500 [===>..........................] - ETA: 2:25 - loss: 1.1560 - regression_loss: 1.0090 - classification_loss: 0.1469 72/500 [===>..........................] - ETA: 2:25 - loss: 1.1534 - regression_loss: 1.0064 - classification_loss: 0.1469 73/500 [===>..........................] 
- ETA: 2:24 - loss: 1.1524 - regression_loss: 1.0058 - classification_loss: 0.1466 74/500 [===>..........................] - ETA: 2:24 - loss: 1.1645 - regression_loss: 1.0158 - classification_loss: 0.1487 75/500 [===>..........................] - ETA: 2:24 - loss: 1.1667 - regression_loss: 1.0180 - classification_loss: 0.1487 76/500 [===>..........................] - ETA: 2:24 - loss: 1.1582 - regression_loss: 1.0105 - classification_loss: 0.1477 77/500 [===>..........................] - ETA: 2:23 - loss: 1.1500 - regression_loss: 1.0038 - classification_loss: 0.1462 78/500 [===>..........................] - ETA: 2:23 - loss: 1.1483 - regression_loss: 1.0024 - classification_loss: 0.1458 79/500 [===>..........................] - ETA: 2:22 - loss: 1.1480 - regression_loss: 1.0028 - classification_loss: 0.1453 80/500 [===>..........................] - ETA: 2:22 - loss: 1.1524 - regression_loss: 1.0063 - classification_loss: 0.1461 81/500 [===>..........................] - ETA: 2:22 - loss: 1.1563 - regression_loss: 1.0095 - classification_loss: 0.1468 82/500 [===>..........................] - ETA: 2:21 - loss: 1.1645 - regression_loss: 1.0167 - classification_loss: 0.1479 83/500 [===>..........................] - ETA: 2:21 - loss: 1.1677 - regression_loss: 1.0203 - classification_loss: 0.1474 84/500 [====>.........................] - ETA: 2:21 - loss: 1.1630 - regression_loss: 1.0167 - classification_loss: 0.1464 85/500 [====>.........................] - ETA: 2:20 - loss: 1.1596 - regression_loss: 1.0136 - classification_loss: 0.1460 86/500 [====>.........................] - ETA: 2:20 - loss: 1.1545 - regression_loss: 1.0091 - classification_loss: 0.1454 87/500 [====>.........................] - ETA: 2:20 - loss: 1.1512 - regression_loss: 1.0061 - classification_loss: 0.1451 88/500 [====>.........................] - ETA: 2:19 - loss: 1.1522 - regression_loss: 1.0076 - classification_loss: 0.1446 89/500 [====>.........................] 
- ETA: 2:19 - loss: 1.1542 - regression_loss: 1.0099 - classification_loss: 0.1443 90/500 [====>.........................] - ETA: 2:19 - loss: 1.1493 - regression_loss: 1.0060 - classification_loss: 0.1433 91/500 [====>.........................] - ETA: 2:18 - loss: 1.1522 - regression_loss: 1.0086 - classification_loss: 0.1437 92/500 [====>.........................] - ETA: 2:18 - loss: 1.1562 - regression_loss: 1.0122 - classification_loss: 0.1439 93/500 [====>.........................] - ETA: 2:18 - loss: 1.1631 - regression_loss: 1.0193 - classification_loss: 0.1438 94/500 [====>.........................] - ETA: 2:17 - loss: 1.1641 - regression_loss: 1.0207 - classification_loss: 0.1434 95/500 [====>.........................] - ETA: 2:17 - loss: 1.1652 - regression_loss: 1.0211 - classification_loss: 0.1442 96/500 [====>.........................] - ETA: 2:16 - loss: 1.1669 - regression_loss: 1.0226 - classification_loss: 0.1443 97/500 [====>.........................] - ETA: 2:16 - loss: 1.1710 - regression_loss: 1.0260 - classification_loss: 0.1450 98/500 [====>.........................] - ETA: 2:16 - loss: 1.1661 - regression_loss: 1.0211 - classification_loss: 0.1450 99/500 [====>.........................] - ETA: 2:15 - loss: 1.1652 - regression_loss: 1.0208 - classification_loss: 0.1444 100/500 [=====>........................] - ETA: 2:15 - loss: 1.1599 - regression_loss: 1.0160 - classification_loss: 0.1439 101/500 [=====>........................] - ETA: 2:15 - loss: 1.1630 - regression_loss: 1.0188 - classification_loss: 0.1441 102/500 [=====>........................] - ETA: 2:14 - loss: 1.1618 - regression_loss: 1.0185 - classification_loss: 0.1434 103/500 [=====>........................] - ETA: 2:14 - loss: 1.1714 - regression_loss: 1.0264 - classification_loss: 0.1449 104/500 [=====>........................] - ETA: 2:13 - loss: 1.1752 - regression_loss: 1.0289 - classification_loss: 0.1462 105/500 [=====>........................] 
- ETA: 2:13 - loss: 1.1781 - regression_loss: 1.0314 - classification_loss: 0.1467 106/500 [=====>........................] - ETA: 2:13 - loss: 1.1798 - regression_loss: 1.0327 - classification_loss: 0.1471 107/500 [=====>........................] - ETA: 2:12 - loss: 1.1827 - regression_loss: 1.0357 - classification_loss: 0.1470 108/500 [=====>........................] - ETA: 2:12 - loss: 1.1829 - regression_loss: 1.0360 - classification_loss: 0.1468 109/500 [=====>........................] - ETA: 2:12 - loss: 1.1809 - regression_loss: 1.0347 - classification_loss: 0.1462 110/500 [=====>........................] - ETA: 2:11 - loss: 1.1874 - regression_loss: 1.0401 - classification_loss: 0.1473 111/500 [=====>........................] - ETA: 2:11 - loss: 1.1852 - regression_loss: 1.0383 - classification_loss: 0.1468 112/500 [=====>........................] - ETA: 2:11 - loss: 1.1849 - regression_loss: 1.0382 - classification_loss: 0.1468 113/500 [=====>........................] - ETA: 2:10 - loss: 1.1845 - regression_loss: 1.0380 - classification_loss: 0.1465 114/500 [=====>........................] - ETA: 2:10 - loss: 1.1841 - regression_loss: 1.0378 - classification_loss: 0.1463 115/500 [=====>........................] - ETA: 2:10 - loss: 1.1803 - regression_loss: 1.0338 - classification_loss: 0.1465 116/500 [=====>........................] - ETA: 2:09 - loss: 1.1830 - regression_loss: 1.0360 - classification_loss: 0.1469 117/500 [======>.......................] - ETA: 2:09 - loss: 1.1831 - regression_loss: 1.0362 - classification_loss: 0.1469 118/500 [======>.......................] - ETA: 2:09 - loss: 1.1849 - regression_loss: 1.0382 - classification_loss: 0.1468 119/500 [======>.......................] - ETA: 2:08 - loss: 1.1893 - regression_loss: 1.0419 - classification_loss: 0.1474 120/500 [======>.......................] - ETA: 2:08 - loss: 1.1842 - regression_loss: 1.0377 - classification_loss: 0.1465 121/500 [======>.......................] 
- ETA: 2:08 - loss: 1.1871 - regression_loss: 1.0400 - classification_loss: 0.1471 122/500 [======>.......................] - ETA: 2:07 - loss: 1.1899 - regression_loss: 1.0423 - classification_loss: 0.1476 123/500 [======>.......................] - ETA: 2:07 - loss: 1.1902 - regression_loss: 1.0424 - classification_loss: 0.1478 124/500 [======>.......................] - ETA: 2:07 - loss: 1.1911 - regression_loss: 1.0434 - classification_loss: 0.1477 125/500 [======>.......................] - ETA: 2:06 - loss: 1.1954 - regression_loss: 1.0470 - classification_loss: 0.1484 126/500 [======>.......................] - ETA: 2:06 - loss: 1.1966 - regression_loss: 1.0481 - classification_loss: 0.1485 127/500 [======>.......................] - ETA: 2:06 - loss: 1.2005 - regression_loss: 1.0514 - classification_loss: 0.1491 128/500 [======>.......................] - ETA: 2:05 - loss: 1.2038 - regression_loss: 1.0548 - classification_loss: 0.1490 129/500 [======>.......................] - ETA: 2:05 - loss: 1.2015 - regression_loss: 1.0527 - classification_loss: 0.1488 130/500 [======>.......................] - ETA: 2:05 - loss: 1.2011 - regression_loss: 1.0506 - classification_loss: 0.1504 131/500 [======>.......................] - ETA: 2:04 - loss: 1.2005 - regression_loss: 1.0501 - classification_loss: 0.1503 132/500 [======>.......................] - ETA: 2:04 - loss: 1.2029 - regression_loss: 1.0520 - classification_loss: 0.1509 133/500 [======>.......................] - ETA: 2:04 - loss: 1.2072 - regression_loss: 1.0556 - classification_loss: 0.1516 134/500 [=======>......................] - ETA: 2:03 - loss: 1.2081 - regression_loss: 1.0563 - classification_loss: 0.1518 135/500 [=======>......................] - ETA: 2:03 - loss: 1.2095 - regression_loss: 1.0575 - classification_loss: 0.1520 136/500 [=======>......................] - ETA: 2:03 - loss: 1.2146 - regression_loss: 1.0610 - classification_loss: 0.1535 137/500 [=======>......................] 
- ETA: 2:02 - loss: 1.2145 - regression_loss: 1.0608 - classification_loss: 0.1537 138/500 [=======>......................] - ETA: 2:02 - loss: 1.2162 - regression_loss: 1.0622 - classification_loss: 0.1539 139/500 [=======>......................] - ETA: 2:02 - loss: 1.2167 - regression_loss: 1.0625 - classification_loss: 0.1543 140/500 [=======>......................] - ETA: 2:01 - loss: 1.2165 - regression_loss: 1.0621 - classification_loss: 0.1544 141/500 [=======>......................] - ETA: 2:01 - loss: 1.2147 - regression_loss: 1.0608 - classification_loss: 0.1539 142/500 [=======>......................] - ETA: 2:01 - loss: 1.2157 - regression_loss: 1.0616 - classification_loss: 0.1541 143/500 [=======>......................] - ETA: 2:01 - loss: 1.2140 - regression_loss: 1.0601 - classification_loss: 0.1539 144/500 [=======>......................] - ETA: 2:00 - loss: 1.2149 - regression_loss: 1.0613 - classification_loss: 0.1536 145/500 [=======>......................] - ETA: 2:00 - loss: 1.2166 - regression_loss: 1.0628 - classification_loss: 0.1538 146/500 [=======>......................] - ETA: 2:00 - loss: 1.2129 - regression_loss: 1.0597 - classification_loss: 0.1531 147/500 [=======>......................] - ETA: 1:59 - loss: 1.2088 - regression_loss: 1.0563 - classification_loss: 0.1525 148/500 [=======>......................] - ETA: 1:59 - loss: 1.2082 - regression_loss: 1.0561 - classification_loss: 0.1521 149/500 [=======>......................] - ETA: 1:59 - loss: 1.2083 - regression_loss: 1.0564 - classification_loss: 0.1519 150/500 [========>.....................] - ETA: 1:58 - loss: 1.2119 - regression_loss: 1.0593 - classification_loss: 0.1525 151/500 [========>.....................] - ETA: 1:58 - loss: 1.2072 - regression_loss: 1.0554 - classification_loss: 0.1518 152/500 [========>.....................] - ETA: 1:57 - loss: 1.2035 - regression_loss: 1.0518 - classification_loss: 0.1517 153/500 [========>.....................] 
- ETA: 1:57 - loss: 1.2029 - regression_loss: 1.0513 - classification_loss: 0.1516 154/500 [========>.....................] - ETA: 1:57 - loss: 1.2061 - regression_loss: 1.0546 - classification_loss: 0.1515 155/500 [========>.....................] - ETA: 1:56 - loss: 1.2067 - regression_loss: 1.0551 - classification_loss: 0.1516 156/500 [========>.....................] - ETA: 1:56 - loss: 1.2077 - regression_loss: 1.0561 - classification_loss: 0.1516 157/500 [========>.....................] - ETA: 1:56 - loss: 1.2034 - regression_loss: 1.0524 - classification_loss: 0.1510 158/500 [========>.....................] - ETA: 1:55 - loss: 1.2049 - regression_loss: 1.0537 - classification_loss: 0.1512 159/500 [========>.....................] - ETA: 1:55 - loss: 1.1993 - regression_loss: 1.0487 - classification_loss: 0.1507 160/500 [========>.....................] - ETA: 1:55 - loss: 1.1957 - regression_loss: 1.0456 - classification_loss: 0.1501 161/500 [========>.....................] - ETA: 1:54 - loss: 1.1947 - regression_loss: 1.0448 - classification_loss: 0.1499 162/500 [========>.....................] - ETA: 1:54 - loss: 1.2003 - regression_loss: 1.0494 - classification_loss: 0.1509 163/500 [========>.....................] - ETA: 1:54 - loss: 1.2021 - regression_loss: 1.0510 - classification_loss: 0.1511 164/500 [========>.....................] - ETA: 1:53 - loss: 1.2030 - regression_loss: 1.0517 - classification_loss: 0.1513 165/500 [========>.....................] - ETA: 1:53 - loss: 1.2047 - regression_loss: 1.0532 - classification_loss: 0.1515 166/500 [========>.....................] - ETA: 1:53 - loss: 1.2047 - regression_loss: 1.0532 - classification_loss: 0.1515 167/500 [=========>....................] - ETA: 1:53 - loss: 1.2065 - regression_loss: 1.0550 - classification_loss: 0.1516 168/500 [=========>....................] - ETA: 1:52 - loss: 1.2033 - regression_loss: 1.0522 - classification_loss: 0.1511 169/500 [=========>....................] 
- ETA: 1:52 - loss: 1.2041 - regression_loss: 1.0530 - classification_loss: 0.1510 170/500 [=========>....................] - ETA: 1:52 - loss: 1.2041 - regression_loss: 1.0530 - classification_loss: 0.1511 171/500 [=========>....................] - ETA: 1:51 - loss: 1.2043 - regression_loss: 1.0533 - classification_loss: 0.1510 172/500 [=========>....................] - ETA: 1:51 - loss: 1.2121 - regression_loss: 1.0600 - classification_loss: 0.1521 173/500 [=========>....................] - ETA: 1:51 - loss: 1.2124 - regression_loss: 1.0605 - classification_loss: 0.1519 174/500 [=========>....................] - ETA: 1:50 - loss: 1.2108 - regression_loss: 1.0588 - classification_loss: 0.1521 175/500 [=========>....................] - ETA: 1:50 - loss: 1.2127 - regression_loss: 1.0606 - classification_loss: 0.1521 176/500 [=========>....................] - ETA: 1:49 - loss: 1.2122 - regression_loss: 1.0601 - classification_loss: 0.1520 177/500 [=========>....................] - ETA: 1:49 - loss: 1.2117 - regression_loss: 1.0599 - classification_loss: 0.1518 178/500 [=========>....................] - ETA: 1:49 - loss: 1.2111 - regression_loss: 1.0594 - classification_loss: 0.1517 179/500 [=========>....................] - ETA: 1:48 - loss: 1.2111 - regression_loss: 1.0593 - classification_loss: 0.1519 180/500 [=========>....................] - ETA: 1:48 - loss: 1.2095 - regression_loss: 1.0579 - classification_loss: 0.1515 181/500 [=========>....................] - ETA: 1:48 - loss: 1.2093 - regression_loss: 1.0578 - classification_loss: 0.1515 182/500 [=========>....................] - ETA: 1:47 - loss: 1.2099 - regression_loss: 1.0583 - classification_loss: 0.1515 183/500 [=========>....................] - ETA: 1:47 - loss: 1.2104 - regression_loss: 1.0588 - classification_loss: 0.1516 184/500 [==========>...................] - ETA: 1:47 - loss: 1.2084 - regression_loss: 1.0573 - classification_loss: 0.1511 185/500 [==========>...................] 
- ETA: 1:46 - loss: 1.2077 - regression_loss: 1.0566 - classification_loss: 0.1511 186/500 [==========>...................] - ETA: 1:46 - loss: 1.2117 - regression_loss: 1.0597 - classification_loss: 0.1521 187/500 [==========>...................] - ETA: 1:46 - loss: 1.2123 - regression_loss: 1.0603 - classification_loss: 0.1520 188/500 [==========>...................] - ETA: 1:45 - loss: 1.2101 - regression_loss: 1.0583 - classification_loss: 0.1518 189/500 [==========>...................] - ETA: 1:45 - loss: 1.2076 - regression_loss: 1.0563 - classification_loss: 0.1513 190/500 [==========>...................] - ETA: 1:45 - loss: 1.2047 - regression_loss: 1.0535 - classification_loss: 0.1512 191/500 [==========>...................] - ETA: 1:44 - loss: 1.2016 - regression_loss: 1.0509 - classification_loss: 0.1507 192/500 [==========>...................] - ETA: 1:44 - loss: 1.2024 - regression_loss: 1.0516 - classification_loss: 0.1508 193/500 [==========>...................] - ETA: 1:44 - loss: 1.1996 - regression_loss: 1.0493 - classification_loss: 0.1503 194/500 [==========>...................] - ETA: 1:43 - loss: 1.2007 - regression_loss: 1.0500 - classification_loss: 0.1507 195/500 [==========>...................] - ETA: 1:43 - loss: 1.1984 - regression_loss: 1.0482 - classification_loss: 0.1502 196/500 [==========>...................] - ETA: 1:43 - loss: 1.1987 - regression_loss: 1.0485 - classification_loss: 0.1502 197/500 [==========>...................] - ETA: 1:42 - loss: 1.1999 - regression_loss: 1.0496 - classification_loss: 0.1503 198/500 [==========>...................] - ETA: 1:42 - loss: 1.2015 - regression_loss: 1.0516 - classification_loss: 0.1499 199/500 [==========>...................] - ETA: 1:42 - loss: 1.1991 - regression_loss: 1.0496 - classification_loss: 0.1495 200/500 [===========>..................] - ETA: 1:41 - loss: 1.2022 - regression_loss: 1.0525 - classification_loss: 0.1497 201/500 [===========>..................] 
[... per-step progress-bar updates for epoch 20 elided (steps 202-499; loss fluctuated between ~1.19 and ~1.22 over the epoch) ...]
500/500 [==============================] - 170s 340ms/step - loss: 1.1914 - regression_loss: 1.0447 - classification_loss: 0.1467
1172 instances of class plum with average precision: 0.7703
mAP: 0.7703
Epoch 00020: saving model to ./training/snapshots/resnet101_pascal_20.h5
Epoch 21/150
[... per-step progress-bar updates for epoch 21 elided (log truncated at step 36/500) ...]
- ETA: 2:35 - loss: 1.2013 - regression_loss: 1.0577 - classification_loss: 0.1436 37/500 [=>............................] - ETA: 2:35 - loss: 1.2031 - regression_loss: 1.0588 - classification_loss: 0.1443 38/500 [=>............................] - ETA: 2:35 - loss: 1.1949 - regression_loss: 1.0524 - classification_loss: 0.1425 39/500 [=>............................] - ETA: 2:35 - loss: 1.1886 - regression_loss: 1.0480 - classification_loss: 0.1406 40/500 [=>............................] - ETA: 2:34 - loss: 1.1974 - regression_loss: 1.0557 - classification_loss: 0.1417 41/500 [=>............................] - ETA: 2:34 - loss: 1.1994 - regression_loss: 1.0574 - classification_loss: 0.1420 42/500 [=>............................] - ETA: 2:34 - loss: 1.2010 - regression_loss: 1.0589 - classification_loss: 0.1422 43/500 [=>............................] - ETA: 2:33 - loss: 1.2034 - regression_loss: 1.0605 - classification_loss: 0.1429 44/500 [=>............................] - ETA: 2:33 - loss: 1.2052 - regression_loss: 1.0625 - classification_loss: 0.1427 45/500 [=>............................] - ETA: 2:33 - loss: 1.2252 - regression_loss: 1.0781 - classification_loss: 0.1471 46/500 [=>............................] - ETA: 2:32 - loss: 1.2334 - regression_loss: 1.0825 - classification_loss: 0.1508 47/500 [=>............................] - ETA: 2:32 - loss: 1.2372 - regression_loss: 1.0861 - classification_loss: 0.1510 48/500 [=>............................] - ETA: 2:31 - loss: 1.2411 - regression_loss: 1.0891 - classification_loss: 0.1519 49/500 [=>............................] - ETA: 2:31 - loss: 1.2290 - regression_loss: 1.0794 - classification_loss: 0.1496 50/500 [==>...........................] - ETA: 2:31 - loss: 1.2303 - regression_loss: 1.0813 - classification_loss: 0.1490 51/500 [==>...........................] - ETA: 2:31 - loss: 1.2348 - regression_loss: 1.0857 - classification_loss: 0.1492 52/500 [==>...........................] 
- ETA: 2:31 - loss: 1.2359 - regression_loss: 1.0871 - classification_loss: 0.1487 53/500 [==>...........................] - ETA: 2:30 - loss: 1.2406 - regression_loss: 1.0907 - classification_loss: 0.1499 54/500 [==>...........................] - ETA: 2:30 - loss: 1.2417 - regression_loss: 1.0910 - classification_loss: 0.1507 55/500 [==>...........................] - ETA: 2:30 - loss: 1.2463 - regression_loss: 1.0950 - classification_loss: 0.1513 56/500 [==>...........................] - ETA: 2:29 - loss: 1.2341 - regression_loss: 1.0843 - classification_loss: 0.1498 57/500 [==>...........................] - ETA: 2:29 - loss: 1.2330 - regression_loss: 1.0837 - classification_loss: 0.1493 58/500 [==>...........................] - ETA: 2:29 - loss: 1.2303 - regression_loss: 1.0815 - classification_loss: 0.1488 59/500 [==>...........................] - ETA: 2:28 - loss: 1.2289 - regression_loss: 1.0805 - classification_loss: 0.1484 60/500 [==>...........................] - ETA: 2:28 - loss: 1.2234 - regression_loss: 1.0755 - classification_loss: 0.1480 61/500 [==>...........................] - ETA: 2:28 - loss: 1.2182 - regression_loss: 1.0709 - classification_loss: 0.1473 62/500 [==>...........................] - ETA: 2:28 - loss: 1.2233 - regression_loss: 1.0755 - classification_loss: 0.1477 63/500 [==>...........................] - ETA: 2:27 - loss: 1.2301 - regression_loss: 1.0817 - classification_loss: 0.1484 64/500 [==>...........................] - ETA: 2:27 - loss: 1.2325 - regression_loss: 1.0832 - classification_loss: 0.1492 65/500 [==>...........................] - ETA: 2:26 - loss: 1.2201 - regression_loss: 1.0722 - classification_loss: 0.1479 66/500 [==>...........................] - ETA: 2:26 - loss: 1.2201 - regression_loss: 1.0716 - classification_loss: 0.1485 67/500 [===>..........................] - ETA: 2:25 - loss: 1.2169 - regression_loss: 1.0687 - classification_loss: 0.1481 68/500 [===>..........................] 
- ETA: 2:25 - loss: 1.2181 - regression_loss: 1.0698 - classification_loss: 0.1483 69/500 [===>..........................] - ETA: 2:25 - loss: 1.2071 - regression_loss: 1.0602 - classification_loss: 0.1470 70/500 [===>..........................] - ETA: 2:24 - loss: 1.2062 - regression_loss: 1.0591 - classification_loss: 0.1471 71/500 [===>..........................] - ETA: 2:24 - loss: 1.2096 - regression_loss: 1.0623 - classification_loss: 0.1473 72/500 [===>..........................] - ETA: 2:24 - loss: 1.2133 - regression_loss: 1.0659 - classification_loss: 0.1474 73/500 [===>..........................] - ETA: 2:24 - loss: 1.2111 - regression_loss: 1.0646 - classification_loss: 0.1465 74/500 [===>..........................] - ETA: 2:23 - loss: 1.2166 - regression_loss: 1.0694 - classification_loss: 0.1472 75/500 [===>..........................] - ETA: 2:23 - loss: 1.2210 - regression_loss: 1.0724 - classification_loss: 0.1486 76/500 [===>..........................] - ETA: 2:23 - loss: 1.2166 - regression_loss: 1.0688 - classification_loss: 0.1478 77/500 [===>..........................] - ETA: 2:22 - loss: 1.2174 - regression_loss: 1.0695 - classification_loss: 0.1479 78/500 [===>..........................] - ETA: 2:22 - loss: 1.2150 - regression_loss: 1.0671 - classification_loss: 0.1478 79/500 [===>..........................] - ETA: 2:22 - loss: 1.2209 - regression_loss: 1.0717 - classification_loss: 0.1492 80/500 [===>..........................] - ETA: 2:22 - loss: 1.2238 - regression_loss: 1.0739 - classification_loss: 0.1499 81/500 [===>..........................] - ETA: 2:21 - loss: 1.2229 - regression_loss: 1.0729 - classification_loss: 0.1500 82/500 [===>..........................] - ETA: 2:21 - loss: 1.2165 - regression_loss: 1.0675 - classification_loss: 0.1490 83/500 [===>..........................] - ETA: 2:21 - loss: 1.2078 - regression_loss: 1.0601 - classification_loss: 0.1477 84/500 [====>.........................] 
- ETA: 2:20 - loss: 1.2148 - regression_loss: 1.0667 - classification_loss: 0.1481 85/500 [====>.........................] - ETA: 2:20 - loss: 1.2155 - regression_loss: 1.0674 - classification_loss: 0.1481 86/500 [====>.........................] - ETA: 2:20 - loss: 1.2215 - regression_loss: 1.0723 - classification_loss: 0.1492 87/500 [====>.........................] - ETA: 2:19 - loss: 1.2236 - regression_loss: 1.0747 - classification_loss: 0.1489 88/500 [====>.........................] - ETA: 2:19 - loss: 1.2160 - regression_loss: 1.0682 - classification_loss: 0.1478 89/500 [====>.........................] - ETA: 2:19 - loss: 1.2171 - regression_loss: 1.0698 - classification_loss: 0.1473 90/500 [====>.........................] - ETA: 2:18 - loss: 1.2171 - regression_loss: 1.0696 - classification_loss: 0.1475 91/500 [====>.........................] - ETA: 2:18 - loss: 1.2191 - regression_loss: 1.0713 - classification_loss: 0.1478 92/500 [====>.........................] - ETA: 2:18 - loss: 1.2240 - regression_loss: 1.0754 - classification_loss: 0.1486 93/500 [====>.........................] - ETA: 2:18 - loss: 1.2282 - regression_loss: 1.0786 - classification_loss: 0.1496 94/500 [====>.........................] - ETA: 2:17 - loss: 1.2244 - regression_loss: 1.0754 - classification_loss: 0.1490 95/500 [====>.........................] - ETA: 2:17 - loss: 1.2242 - regression_loss: 1.0754 - classification_loss: 0.1488 96/500 [====>.........................] - ETA: 2:17 - loss: 1.2229 - regression_loss: 1.0747 - classification_loss: 0.1483 97/500 [====>.........................] - ETA: 2:16 - loss: 1.2137 - regression_loss: 1.0666 - classification_loss: 0.1470 98/500 [====>.........................] - ETA: 2:16 - loss: 1.2126 - regression_loss: 1.0660 - classification_loss: 0.1466 99/500 [====>.........................] - ETA: 2:16 - loss: 1.2086 - regression_loss: 1.0630 - classification_loss: 0.1457 100/500 [=====>........................] 
- ETA: 2:15 - loss: 1.2132 - regression_loss: 1.0673 - classification_loss: 0.1459 101/500 [=====>........................] - ETA: 2:15 - loss: 1.2171 - regression_loss: 1.0710 - classification_loss: 0.1460 102/500 [=====>........................] - ETA: 2:15 - loss: 1.2174 - regression_loss: 1.0716 - classification_loss: 0.1458 103/500 [=====>........................] - ETA: 2:14 - loss: 1.2197 - regression_loss: 1.0738 - classification_loss: 0.1460 104/500 [=====>........................] - ETA: 2:14 - loss: 1.2203 - regression_loss: 1.0747 - classification_loss: 0.1455 105/500 [=====>........................] - ETA: 2:14 - loss: 1.2214 - regression_loss: 1.0756 - classification_loss: 0.1459 106/500 [=====>........................] - ETA: 2:13 - loss: 1.2156 - regression_loss: 1.0705 - classification_loss: 0.1451 107/500 [=====>........................] - ETA: 2:13 - loss: 1.2158 - regression_loss: 1.0712 - classification_loss: 0.1446 108/500 [=====>........................] - ETA: 2:13 - loss: 1.2167 - regression_loss: 1.0723 - classification_loss: 0.1445 109/500 [=====>........................] - ETA: 2:12 - loss: 1.2169 - regression_loss: 1.0726 - classification_loss: 0.1443 110/500 [=====>........................] - ETA: 2:12 - loss: 1.2165 - regression_loss: 1.0726 - classification_loss: 0.1439 111/500 [=====>........................] - ETA: 2:12 - loss: 1.2192 - regression_loss: 1.0747 - classification_loss: 0.1444 112/500 [=====>........................] - ETA: 2:11 - loss: 1.2196 - regression_loss: 1.0750 - classification_loss: 0.1446 113/500 [=====>........................] - ETA: 2:11 - loss: 1.2167 - regression_loss: 1.0726 - classification_loss: 0.1441 114/500 [=====>........................] - ETA: 2:11 - loss: 1.2168 - regression_loss: 1.0730 - classification_loss: 0.1439 115/500 [=====>........................] - ETA: 2:10 - loss: 1.2106 - regression_loss: 1.0673 - classification_loss: 0.1433 116/500 [=====>........................] 
- ETA: 2:10 - loss: 1.2149 - regression_loss: 1.0707 - classification_loss: 0.1443 117/500 [======>.......................] - ETA: 2:09 - loss: 1.2102 - regression_loss: 1.0667 - classification_loss: 0.1435 118/500 [======>.......................] - ETA: 2:09 - loss: 1.2087 - regression_loss: 1.0654 - classification_loss: 0.1433 119/500 [======>.......................] - ETA: 2:09 - loss: 1.2059 - regression_loss: 1.0625 - classification_loss: 0.1434 120/500 [======>.......................] - ETA: 2:09 - loss: 1.2052 - regression_loss: 1.0621 - classification_loss: 0.1431 121/500 [======>.......................] - ETA: 2:08 - loss: 1.2038 - regression_loss: 1.0608 - classification_loss: 0.1429 122/500 [======>.......................] - ETA: 2:08 - loss: 1.1998 - regression_loss: 1.0576 - classification_loss: 0.1423 123/500 [======>.......................] - ETA: 2:08 - loss: 1.1976 - regression_loss: 1.0556 - classification_loss: 0.1420 124/500 [======>.......................] - ETA: 2:07 - loss: 1.1942 - regression_loss: 1.0530 - classification_loss: 0.1413 125/500 [======>.......................] - ETA: 2:07 - loss: 1.1921 - regression_loss: 1.0510 - classification_loss: 0.1410 126/500 [======>.......................] - ETA: 2:07 - loss: 1.1929 - regression_loss: 1.0518 - classification_loss: 0.1411 127/500 [======>.......................] - ETA: 2:06 - loss: 1.1940 - regression_loss: 1.0524 - classification_loss: 0.1416 128/500 [======>.......................] - ETA: 2:06 - loss: 1.1950 - regression_loss: 1.0532 - classification_loss: 0.1418 129/500 [======>.......................] - ETA: 2:06 - loss: 1.1897 - regression_loss: 1.0488 - classification_loss: 0.1410 130/500 [======>.......................] - ETA: 2:05 - loss: 1.1893 - regression_loss: 1.0485 - classification_loss: 0.1407 131/500 [======>.......................] - ETA: 2:05 - loss: 1.1923 - regression_loss: 1.0510 - classification_loss: 0.1413 132/500 [======>.......................] 
- ETA: 2:05 - loss: 1.1934 - regression_loss: 1.0522 - classification_loss: 0.1412 133/500 [======>.......................] - ETA: 2:04 - loss: 1.1996 - regression_loss: 1.0565 - classification_loss: 0.1430 134/500 [=======>......................] - ETA: 2:04 - loss: 1.1995 - regression_loss: 1.0564 - classification_loss: 0.1430 135/500 [=======>......................] - ETA: 2:04 - loss: 1.1947 - regression_loss: 1.0523 - classification_loss: 0.1424 136/500 [=======>......................] - ETA: 2:03 - loss: 1.1961 - regression_loss: 1.0534 - classification_loss: 0.1427 137/500 [=======>......................] - ETA: 2:03 - loss: 1.1957 - regression_loss: 1.0535 - classification_loss: 0.1422 138/500 [=======>......................] - ETA: 2:03 - loss: 1.1970 - regression_loss: 1.0547 - classification_loss: 0.1423 139/500 [=======>......................] - ETA: 2:02 - loss: 1.1938 - regression_loss: 1.0519 - classification_loss: 0.1419 140/500 [=======>......................] - ETA: 2:02 - loss: 1.1932 - regression_loss: 1.0515 - classification_loss: 0.1416 141/500 [=======>......................] - ETA: 2:02 - loss: 1.1934 - regression_loss: 1.0514 - classification_loss: 0.1419 142/500 [=======>......................] - ETA: 2:01 - loss: 1.1873 - regression_loss: 1.0461 - classification_loss: 0.1413 143/500 [=======>......................] - ETA: 2:01 - loss: 1.1889 - regression_loss: 1.0474 - classification_loss: 0.1414 144/500 [=======>......................] - ETA: 2:01 - loss: 1.1874 - regression_loss: 1.0463 - classification_loss: 0.1411 145/500 [=======>......................] - ETA: 2:00 - loss: 1.1836 - regression_loss: 1.0429 - classification_loss: 0.1407 146/500 [=======>......................] - ETA: 2:00 - loss: 1.1783 - regression_loss: 1.0384 - classification_loss: 0.1399 147/500 [=======>......................] - ETA: 2:00 - loss: 1.1761 - regression_loss: 1.0362 - classification_loss: 0.1399 148/500 [=======>......................] 
- ETA: 1:59 - loss: 1.1775 - regression_loss: 1.0374 - classification_loss: 0.1401 149/500 [=======>......................] - ETA: 1:59 - loss: 1.1773 - regression_loss: 1.0372 - classification_loss: 0.1401 150/500 [========>.....................] - ETA: 1:59 - loss: 1.1721 - regression_loss: 1.0327 - classification_loss: 0.1395 151/500 [========>.....................] - ETA: 1:58 - loss: 1.1741 - regression_loss: 1.0345 - classification_loss: 0.1396 152/500 [========>.....................] - ETA: 1:58 - loss: 1.1705 - regression_loss: 1.0313 - classification_loss: 0.1392 153/500 [========>.....................] - ETA: 1:58 - loss: 1.1735 - regression_loss: 1.0332 - classification_loss: 0.1403 154/500 [========>.....................] - ETA: 1:57 - loss: 1.1763 - regression_loss: 1.0362 - classification_loss: 0.1401 155/500 [========>.....................] - ETA: 1:57 - loss: 1.1762 - regression_loss: 1.0363 - classification_loss: 0.1400 156/500 [========>.....................] - ETA: 1:57 - loss: 1.1770 - regression_loss: 1.0370 - classification_loss: 0.1400 157/500 [========>.....................] - ETA: 1:56 - loss: 1.1772 - regression_loss: 1.0375 - classification_loss: 0.1398 158/500 [========>.....................] - ETA: 1:56 - loss: 1.1769 - regression_loss: 1.0371 - classification_loss: 0.1398 159/500 [========>.....................] - ETA: 1:56 - loss: 1.1787 - regression_loss: 1.0390 - classification_loss: 0.1397 160/500 [========>.....................] - ETA: 1:55 - loss: 1.1790 - regression_loss: 1.0392 - classification_loss: 0.1398 161/500 [========>.....................] - ETA: 1:55 - loss: 1.1824 - regression_loss: 1.0420 - classification_loss: 0.1404 162/500 [========>.....................] - ETA: 1:55 - loss: 1.1825 - regression_loss: 1.0418 - classification_loss: 0.1407 163/500 [========>.....................] - ETA: 1:54 - loss: 1.1823 - regression_loss: 1.0416 - classification_loss: 0.1407 164/500 [========>.....................] 
- ETA: 1:54 - loss: 1.1817 - regression_loss: 1.0414 - classification_loss: 0.1403 165/500 [========>.....................] - ETA: 1:54 - loss: 1.1803 - regression_loss: 1.0402 - classification_loss: 0.1401 166/500 [========>.....................] - ETA: 1:53 - loss: 1.1795 - regression_loss: 1.0397 - classification_loss: 0.1398 167/500 [=========>....................] - ETA: 1:53 - loss: 1.1788 - regression_loss: 1.0391 - classification_loss: 0.1397 168/500 [=========>....................] - ETA: 1:53 - loss: 1.1766 - regression_loss: 1.0373 - classification_loss: 0.1394 169/500 [=========>....................] - ETA: 1:52 - loss: 1.1759 - regression_loss: 1.0366 - classification_loss: 0.1392 170/500 [=========>....................] - ETA: 1:52 - loss: 1.1761 - regression_loss: 1.0371 - classification_loss: 0.1391 171/500 [=========>....................] - ETA: 1:52 - loss: 1.1728 - regression_loss: 1.0336 - classification_loss: 0.1392 172/500 [=========>....................] - ETA: 1:51 - loss: 1.1731 - regression_loss: 1.0340 - classification_loss: 0.1391 173/500 [=========>....................] - ETA: 1:51 - loss: 1.1746 - regression_loss: 1.0353 - classification_loss: 0.1393 174/500 [=========>....................] - ETA: 1:51 - loss: 1.1717 - regression_loss: 1.0328 - classification_loss: 0.1389 175/500 [=========>....................] - ETA: 1:50 - loss: 1.1702 - regression_loss: 1.0316 - classification_loss: 0.1386 176/500 [=========>....................] - ETA: 1:50 - loss: 1.1670 - regression_loss: 1.0289 - classification_loss: 0.1380 177/500 [=========>....................] - ETA: 1:50 - loss: 1.1669 - regression_loss: 1.0291 - classification_loss: 0.1378 178/500 [=========>....................] - ETA: 1:49 - loss: 1.1647 - regression_loss: 1.0273 - classification_loss: 0.1374 179/500 [=========>....................] - ETA: 1:49 - loss: 1.1669 - regression_loss: 1.0291 - classification_loss: 0.1377 180/500 [=========>....................] 
- ETA: 1:49 - loss: 1.1678 - regression_loss: 1.0298 - classification_loss: 0.1380 181/500 [=========>....................] - ETA: 1:48 - loss: 1.1708 - regression_loss: 1.0321 - classification_loss: 0.1387 182/500 [=========>....................] - ETA: 1:48 - loss: 1.1671 - regression_loss: 1.0290 - classification_loss: 0.1381 183/500 [=========>....................] - ETA: 1:48 - loss: 1.1664 - regression_loss: 1.0284 - classification_loss: 0.1380 184/500 [==========>...................] - ETA: 1:47 - loss: 1.1650 - regression_loss: 1.0271 - classification_loss: 0.1379 185/500 [==========>...................] - ETA: 1:47 - loss: 1.1613 - regression_loss: 1.0238 - classification_loss: 0.1375 186/500 [==========>...................] - ETA: 1:47 - loss: 1.1631 - regression_loss: 1.0253 - classification_loss: 0.1379 187/500 [==========>...................] - ETA: 1:46 - loss: 1.1636 - regression_loss: 1.0260 - classification_loss: 0.1376 188/500 [==========>...................] - ETA: 1:46 - loss: 1.1613 - regression_loss: 1.0240 - classification_loss: 0.1373 189/500 [==========>...................] - ETA: 1:46 - loss: 1.1582 - regression_loss: 1.0214 - classification_loss: 0.1368 190/500 [==========>...................] - ETA: 1:45 - loss: 1.1600 - regression_loss: 1.0230 - classification_loss: 0.1370 191/500 [==========>...................] - ETA: 1:45 - loss: 1.1593 - regression_loss: 1.0224 - classification_loss: 0.1369 192/500 [==========>...................] - ETA: 1:45 - loss: 1.1589 - regression_loss: 1.0222 - classification_loss: 0.1367 193/500 [==========>...................] - ETA: 1:44 - loss: 1.1596 - regression_loss: 1.0229 - classification_loss: 0.1367 194/500 [==========>...................] - ETA: 1:44 - loss: 1.1562 - regression_loss: 1.0200 - classification_loss: 0.1362 195/500 [==========>...................] - ETA: 1:44 - loss: 1.1565 - regression_loss: 1.0201 - classification_loss: 0.1364 196/500 [==========>...................] 
- ETA: 1:43 - loss: 1.1569 - regression_loss: 1.0203 - classification_loss: 0.1366 197/500 [==========>...................] - ETA: 1:43 - loss: 1.1575 - regression_loss: 1.0208 - classification_loss: 0.1366 198/500 [==========>...................] - ETA: 1:43 - loss: 1.1547 - regression_loss: 1.0184 - classification_loss: 0.1364 199/500 [==========>...................] - ETA: 1:42 - loss: 1.1542 - regression_loss: 1.0182 - classification_loss: 0.1360 200/500 [===========>..................] - ETA: 1:42 - loss: 1.1562 - regression_loss: 1.0195 - classification_loss: 0.1367 201/500 [===========>..................] - ETA: 1:42 - loss: 1.1569 - regression_loss: 1.0202 - classification_loss: 0.1367 202/500 [===========>..................] - ETA: 1:41 - loss: 1.1575 - regression_loss: 1.0207 - classification_loss: 0.1368 203/500 [===========>..................] - ETA: 1:41 - loss: 1.1568 - regression_loss: 1.0205 - classification_loss: 0.1363 204/500 [===========>..................] - ETA: 1:41 - loss: 1.1578 - regression_loss: 1.0214 - classification_loss: 0.1364 205/500 [===========>..................] - ETA: 1:40 - loss: 1.1578 - regression_loss: 1.0209 - classification_loss: 0.1369 206/500 [===========>..................] - ETA: 1:40 - loss: 1.1574 - regression_loss: 1.0204 - classification_loss: 0.1370 207/500 [===========>..................] - ETA: 1:39 - loss: 1.1569 - regression_loss: 1.0200 - classification_loss: 0.1368 208/500 [===========>..................] - ETA: 1:39 - loss: 1.1573 - regression_loss: 1.0204 - classification_loss: 0.1369 209/500 [===========>..................] - ETA: 1:39 - loss: 1.1583 - regression_loss: 1.0212 - classification_loss: 0.1371 210/500 [===========>..................] - ETA: 1:38 - loss: 1.1564 - regression_loss: 1.0198 - classification_loss: 0.1366 211/500 [===========>..................] - ETA: 1:38 - loss: 1.1564 - regression_loss: 1.0198 - classification_loss: 0.1367 212/500 [===========>..................] 
- ETA: 1:38 - loss: 1.1536 - regression_loss: 1.0175 - classification_loss: 0.1361 213/500 [===========>..................] - ETA: 1:37 - loss: 1.1555 - regression_loss: 1.0192 - classification_loss: 0.1363 214/500 [===========>..................] - ETA: 1:37 - loss: 1.1550 - regression_loss: 1.0188 - classification_loss: 0.1362 215/500 [===========>..................] - ETA: 1:37 - loss: 1.1583 - regression_loss: 1.0216 - classification_loss: 0.1367 216/500 [===========>..................] - ETA: 1:36 - loss: 1.1551 - regression_loss: 1.0189 - classification_loss: 0.1362 217/500 [============>.................] - ETA: 1:36 - loss: 1.1537 - regression_loss: 1.0178 - classification_loss: 0.1358 218/500 [============>.................] - ETA: 1:36 - loss: 1.1515 - regression_loss: 1.0161 - classification_loss: 0.1354 219/500 [============>.................] - ETA: 1:35 - loss: 1.1487 - regression_loss: 1.0138 - classification_loss: 0.1350 220/500 [============>.................] - ETA: 1:35 - loss: 1.1460 - regression_loss: 1.0114 - classification_loss: 0.1345 221/500 [============>.................] - ETA: 1:35 - loss: 1.1487 - regression_loss: 1.0138 - classification_loss: 0.1349 222/500 [============>.................] - ETA: 1:34 - loss: 1.1483 - regression_loss: 1.0134 - classification_loss: 0.1349 223/500 [============>.................] - ETA: 1:34 - loss: 1.1462 - regression_loss: 1.0115 - classification_loss: 0.1348 224/500 [============>.................] - ETA: 1:34 - loss: 1.1470 - regression_loss: 1.0121 - classification_loss: 0.1349 225/500 [============>.................] - ETA: 1:33 - loss: 1.1477 - regression_loss: 1.0128 - classification_loss: 0.1349 226/500 [============>.................] - ETA: 1:33 - loss: 1.1484 - regression_loss: 1.0133 - classification_loss: 0.1350 227/500 [============>.................] - ETA: 1:33 - loss: 1.1490 - regression_loss: 1.0138 - classification_loss: 0.1352 228/500 [============>.................] 
- ETA: 1:32 - loss: 1.1477 - regression_loss: 1.0127 - classification_loss: 0.1350 229/500 [============>.................] - ETA: 1:32 - loss: 1.1469 - regression_loss: 1.0120 - classification_loss: 0.1349 230/500 [============>.................] - ETA: 1:32 - loss: 1.1445 - regression_loss: 1.0101 - classification_loss: 0.1344 231/500 [============>.................] - ETA: 1:31 - loss: 1.1435 - regression_loss: 1.0089 - classification_loss: 0.1345 232/500 [============>.................] - ETA: 1:31 - loss: 1.1436 - regression_loss: 1.0090 - classification_loss: 0.1346 233/500 [============>.................] - ETA: 1:31 - loss: 1.1444 - regression_loss: 1.0098 - classification_loss: 0.1346 234/500 [=============>................] - ETA: 1:30 - loss: 1.1445 - regression_loss: 1.0100 - classification_loss: 0.1345 235/500 [=============>................] - ETA: 1:30 - loss: 1.1431 - regression_loss: 1.0087 - classification_loss: 0.1344 236/500 [=============>................] - ETA: 1:30 - loss: 1.1411 - regression_loss: 1.0070 - classification_loss: 0.1341 237/500 [=============>................] - ETA: 1:29 - loss: 1.1428 - regression_loss: 1.0086 - classification_loss: 0.1341 238/500 [=============>................] - ETA: 1:29 - loss: 1.1417 - regression_loss: 1.0079 - classification_loss: 0.1338 239/500 [=============>................] - ETA: 1:29 - loss: 1.1416 - regression_loss: 1.0079 - classification_loss: 0.1338 240/500 [=============>................] - ETA: 1:28 - loss: 1.1436 - regression_loss: 1.0098 - classification_loss: 0.1337 241/500 [=============>................] - ETA: 1:28 - loss: 1.1437 - regression_loss: 1.0096 - classification_loss: 0.1341 242/500 [=============>................] - ETA: 1:27 - loss: 1.1460 - regression_loss: 1.0113 - classification_loss: 0.1347 243/500 [=============>................] - ETA: 1:27 - loss: 1.1462 - regression_loss: 1.0115 - classification_loss: 0.1347 244/500 [=============>................] 
- ETA: 1:27 - loss: 1.1435 - regression_loss: 1.0092 - classification_loss: 0.1343 245/500 [=============>................] - ETA: 1:26 - loss: 1.1453 - regression_loss: 1.0106 - classification_loss: 0.1347 246/500 [=============>................] - ETA: 1:26 - loss: 1.1445 - regression_loss: 1.0098 - classification_loss: 0.1347 247/500 [=============>................] - ETA: 1:26 - loss: 1.1421 - regression_loss: 1.0078 - classification_loss: 0.1343 248/500 [=============>................] - ETA: 1:25 - loss: 1.1413 - regression_loss: 1.0073 - classification_loss: 0.1341 249/500 [=============>................] - ETA: 1:25 - loss: 1.1410 - regression_loss: 1.0071 - classification_loss: 0.1339 250/500 [==============>...............] - ETA: 1:25 - loss: 1.1377 - regression_loss: 1.0041 - classification_loss: 0.1337 251/500 [==============>...............] - ETA: 1:24 - loss: 1.1384 - regression_loss: 1.0048 - classification_loss: 0.1337 252/500 [==============>...............] - ETA: 1:24 - loss: 1.1388 - regression_loss: 1.0052 - classification_loss: 0.1336 253/500 [==============>...............] - ETA: 1:24 - loss: 1.1412 - regression_loss: 1.0070 - classification_loss: 0.1342 254/500 [==============>...............] - ETA: 1:23 - loss: 1.1419 - regression_loss: 1.0077 - classification_loss: 0.1343 255/500 [==============>...............] - ETA: 1:23 - loss: 1.1429 - regression_loss: 1.0085 - classification_loss: 0.1344 256/500 [==============>...............] - ETA: 1:23 - loss: 1.1435 - regression_loss: 1.0090 - classification_loss: 0.1345 257/500 [==============>...............] - ETA: 1:22 - loss: 1.1436 - regression_loss: 1.0091 - classification_loss: 0.1345 258/500 [==============>...............] - ETA: 1:22 - loss: 1.1469 - regression_loss: 1.0120 - classification_loss: 0.1349 259/500 [==============>...............] - ETA: 1:22 - loss: 1.1470 - regression_loss: 1.0121 - classification_loss: 0.1349 260/500 [==============>...............] 
- ETA: 1:21 - loss: 1.1461 - regression_loss: 1.0114 - classification_loss: 0.1347 261/500 [==============>...............] - ETA: 1:21 - loss: 1.1471 - regression_loss: 1.0123 - classification_loss: 0.1348 262/500 [==============>...............] - ETA: 1:21 - loss: 1.1457 - regression_loss: 1.0111 - classification_loss: 0.1346 263/500 [==============>...............] - ETA: 1:20 - loss: 1.1465 - regression_loss: 1.0119 - classification_loss: 0.1346 264/500 [==============>...............] - ETA: 1:20 - loss: 1.1485 - regression_loss: 1.0136 - classification_loss: 0.1350 265/500 [==============>...............] - ETA: 1:20 - loss: 1.1495 - regression_loss: 1.0144 - classification_loss: 0.1351 266/500 [==============>...............] - ETA: 1:19 - loss: 1.1490 - regression_loss: 1.0140 - classification_loss: 0.1350 267/500 [===============>..............] - ETA: 1:19 - loss: 1.1506 - regression_loss: 1.0154 - classification_loss: 0.1352 268/500 [===============>..............] - ETA: 1:19 - loss: 1.1493 - regression_loss: 1.0143 - classification_loss: 0.1349 269/500 [===============>..............] - ETA: 1:18 - loss: 1.1488 - regression_loss: 1.0140 - classification_loss: 0.1347 270/500 [===============>..............] - ETA: 1:18 - loss: 1.1499 - regression_loss: 1.0150 - classification_loss: 0.1350 271/500 [===============>..............] - ETA: 1:18 - loss: 1.1513 - regression_loss: 1.0163 - classification_loss: 0.1350 272/500 [===============>..............] - ETA: 1:17 - loss: 1.1522 - regression_loss: 1.0170 - classification_loss: 0.1352 273/500 [===============>..............] - ETA: 1:17 - loss: 1.1533 - regression_loss: 1.0178 - classification_loss: 0.1354 274/500 [===============>..............] - ETA: 1:17 - loss: 1.1556 - regression_loss: 1.0198 - classification_loss: 0.1358 275/500 [===============>..............] - ETA: 1:16 - loss: 1.1544 - regression_loss: 1.0187 - classification_loss: 0.1357 276/500 [===============>..............] 
- ETA: 1:16 - loss: 1.1533 - regression_loss: 1.0178 - classification_loss: 0.1355 277/500 [===============>..............] - ETA: 1:16 - loss: 1.1541 - regression_loss: 1.0185 - classification_loss: 0.1356 278/500 [===============>..............] - ETA: 1:15 - loss: 1.1551 - regression_loss: 1.0195 - classification_loss: 0.1356 279/500 [===============>..............] - ETA: 1:15 - loss: 1.1536 - regression_loss: 1.0183 - classification_loss: 0.1353 280/500 [===============>..............] - ETA: 1:15 - loss: 1.1536 - regression_loss: 1.0183 - classification_loss: 0.1353 281/500 [===============>..............] - ETA: 1:14 - loss: 1.1517 - regression_loss: 1.0166 - classification_loss: 0.1350 282/500 [===============>..............] - ETA: 1:14 - loss: 1.1500 - regression_loss: 1.0152 - classification_loss: 0.1348 283/500 [===============>..............] - ETA: 1:14 - loss: 1.1502 - regression_loss: 1.0154 - classification_loss: 0.1348 284/500 [================>.............] - ETA: 1:13 - loss: 1.1511 - regression_loss: 1.0160 - classification_loss: 0.1351 285/500 [================>.............] - ETA: 1:13 - loss: 1.1518 - regression_loss: 1.0165 - classification_loss: 0.1353 286/500 [================>.............] - ETA: 1:13 - loss: 1.1529 - regression_loss: 1.0174 - classification_loss: 0.1355 287/500 [================>.............] - ETA: 1:12 - loss: 1.1537 - regression_loss: 1.0181 - classification_loss: 0.1356 288/500 [================>.............] - ETA: 1:12 - loss: 1.1532 - regression_loss: 1.0176 - classification_loss: 0.1356 289/500 [================>.............] - ETA: 1:12 - loss: 1.1527 - regression_loss: 1.0171 - classification_loss: 0.1356 290/500 [================>.............] - ETA: 1:11 - loss: 1.1536 - regression_loss: 1.0179 - classification_loss: 0.1356 291/500 [================>.............] - ETA: 1:11 - loss: 1.1535 - regression_loss: 1.0179 - classification_loss: 0.1356 292/500 [================>.............] 
- ETA: 1:10 - loss: 1.1540 - regression_loss: 1.0182 - classification_loss: 0.1357 293/500 [================>.............] - ETA: 1:10 - loss: 1.1547 - regression_loss: 1.0190 - classification_loss: 0.1356 294/500 [================>.............] - ETA: 1:10 - loss: 1.1541 - regression_loss: 1.0185 - classification_loss: 0.1356 295/500 [================>.............] - ETA: 1:09 - loss: 1.1554 - regression_loss: 1.0197 - classification_loss: 0.1357 296/500 [================>.............] - ETA: 1:09 - loss: 1.1555 - regression_loss: 1.0197 - classification_loss: 0.1358 297/500 [================>.............] - ETA: 1:09 - loss: 1.1569 - regression_loss: 1.0211 - classification_loss: 0.1358 298/500 [================>.............] - ETA: 1:08 - loss: 1.1582 - regression_loss: 1.0222 - classification_loss: 0.1361 299/500 [================>.............] - ETA: 1:08 - loss: 1.1592 - regression_loss: 1.0233 - classification_loss: 0.1359 300/500 [=================>............] - ETA: 1:08 - loss: 1.1604 - regression_loss: 1.0244 - classification_loss: 0.1360 301/500 [=================>............] - ETA: 1:07 - loss: 1.1603 - regression_loss: 1.0242 - classification_loss: 0.1360 302/500 [=================>............] - ETA: 1:07 - loss: 1.1616 - regression_loss: 1.0253 - classification_loss: 0.1362 303/500 [=================>............] - ETA: 1:07 - loss: 1.1611 - regression_loss: 1.0248 - classification_loss: 0.1362 304/500 [=================>............] - ETA: 1:06 - loss: 1.1619 - regression_loss: 1.0256 - classification_loss: 0.1363 305/500 [=================>............] - ETA: 1:06 - loss: 1.1617 - regression_loss: 1.0254 - classification_loss: 0.1363 306/500 [=================>............] - ETA: 1:06 - loss: 1.1623 - regression_loss: 1.0259 - classification_loss: 0.1363 307/500 [=================>............] - ETA: 1:05 - loss: 1.1624 - regression_loss: 1.0260 - classification_loss: 0.1363 308/500 [=================>............] 
- ETA: 1:05 - loss: 1.1615 - regression_loss: 1.0254 - classification_loss: 0.1361 309/500 [=================>............] - ETA: 1:05 - loss: 1.1614 - regression_loss: 1.0254 - classification_loss: 0.1360 310/500 [=================>............] - ETA: 1:04 - loss: 1.1629 - regression_loss: 1.0265 - classification_loss: 0.1364 311/500 [=================>............] - ETA: 1:04 - loss: 1.1634 - regression_loss: 1.0269 - classification_loss: 0.1365 312/500 [=================>............] - ETA: 1:04 - loss: 1.1612 - regression_loss: 1.0250 - classification_loss: 0.1362 313/500 [=================>............] - ETA: 1:03 - loss: 1.1617 - regression_loss: 1.0254 - classification_loss: 0.1363 314/500 [=================>............] - ETA: 1:03 - loss: 1.1679 - regression_loss: 1.0310 - classification_loss: 0.1369 315/500 [=================>............] - ETA: 1:03 - loss: 1.1685 - regression_loss: 1.0314 - classification_loss: 0.1371 316/500 [=================>............] - ETA: 1:02 - loss: 1.1689 - regression_loss: 1.0318 - classification_loss: 0.1372 317/500 [==================>...........] - ETA: 1:02 - loss: 1.1674 - regression_loss: 1.0305 - classification_loss: 0.1368 318/500 [==================>...........] - ETA: 1:02 - loss: 1.1699 - regression_loss: 1.0327 - classification_loss: 0.1372 319/500 [==================>...........] - ETA: 1:01 - loss: 1.1689 - regression_loss: 1.0319 - classification_loss: 0.1370 320/500 [==================>...........] - ETA: 1:01 - loss: 1.1688 - regression_loss: 1.0318 - classification_loss: 0.1370 321/500 [==================>...........] - ETA: 1:01 - loss: 1.1701 - regression_loss: 1.0328 - classification_loss: 0.1373 322/500 [==================>...........] - ETA: 1:00 - loss: 1.1705 - regression_loss: 1.0332 - classification_loss: 0.1373 323/500 [==================>...........] - ETA: 1:00 - loss: 1.1712 - regression_loss: 1.0338 - classification_loss: 0.1374 324/500 [==================>...........] 
- ETA: 1:00 - loss: 1.1727 - regression_loss: 1.0349 - classification_loss: 0.1378 325/500 [==================>...........] - ETA: 59s - loss: 1.1709 - regression_loss: 1.0334 - classification_loss: 0.1375  326/500 [==================>...........] - ETA: 59s - loss: 1.1690 - regression_loss: 1.0317 - classification_loss: 0.1372 327/500 [==================>...........] - ETA: 59s - loss: 1.1685 - regression_loss: 1.0314 - classification_loss: 0.1371 328/500 [==================>...........] - ETA: 58s - loss: 1.1666 - regression_loss: 1.0298 - classification_loss: 0.1368 329/500 [==================>...........] - ETA: 58s - loss: 1.1651 - regression_loss: 1.0284 - classification_loss: 0.1366 330/500 [==================>...........] - ETA: 57s - loss: 1.1662 - regression_loss: 1.0291 - classification_loss: 0.1371 331/500 [==================>...........] - ETA: 57s - loss: 1.1671 - regression_loss: 1.0299 - classification_loss: 0.1373 332/500 [==================>...........] - ETA: 57s - loss: 1.1670 - regression_loss: 1.0298 - classification_loss: 0.1371 333/500 [==================>...........] - ETA: 56s - loss: 1.1686 - regression_loss: 1.0313 - classification_loss: 0.1372 334/500 [===================>..........] - ETA: 56s - loss: 1.1688 - regression_loss: 1.0315 - classification_loss: 0.1373 335/500 [===================>..........] - ETA: 56s - loss: 1.1673 - regression_loss: 1.0303 - classification_loss: 0.1370 336/500 [===================>..........] - ETA: 55s - loss: 1.1679 - regression_loss: 1.0310 - classification_loss: 0.1369 337/500 [===================>..........] - ETA: 55s - loss: 1.1677 - regression_loss: 1.0309 - classification_loss: 0.1368 338/500 [===================>..........] - ETA: 55s - loss: 1.1679 - regression_loss: 1.0310 - classification_loss: 0.1369 339/500 [===================>..........] - ETA: 54s - loss: 1.1676 - regression_loss: 1.0308 - classification_loss: 0.1368 340/500 [===================>..........] 
- ETA: 54s - loss: 1.1666 - regression_loss: 1.0300 - classification_loss: 0.1366 341/500 [===================>..........] - ETA: 54s - loss: 1.1652 - regression_loss: 1.0288 - classification_loss: 0.1364 342/500 [===================>..........] - ETA: 53s - loss: 1.1651 - regression_loss: 1.0285 - classification_loss: 0.1366 343/500 [===================>..........] - ETA: 53s - loss: 1.1650 - regression_loss: 1.0282 - classification_loss: 0.1368 344/500 [===================>..........] - ETA: 53s - loss: 1.1644 - regression_loss: 1.0277 - classification_loss: 0.1368 345/500 [===================>..........] - ETA: 52s - loss: 1.1655 - regression_loss: 1.0286 - classification_loss: 0.1369 346/500 [===================>..........] - ETA: 52s - loss: 1.1648 - regression_loss: 1.0281 - classification_loss: 0.1367 347/500 [===================>..........] - ETA: 52s - loss: 1.1629 - regression_loss: 1.0265 - classification_loss: 0.1364 348/500 [===================>..........] - ETA: 51s - loss: 1.1631 - regression_loss: 1.0267 - classification_loss: 0.1363 349/500 [===================>..........] - ETA: 51s - loss: 1.1636 - regression_loss: 1.0271 - classification_loss: 0.1364 350/500 [====================>.........] - ETA: 51s - loss: 1.1631 - regression_loss: 1.0267 - classification_loss: 0.1364 351/500 [====================>.........] - ETA: 50s - loss: 1.1612 - regression_loss: 1.0250 - classification_loss: 0.1361 352/500 [====================>.........] - ETA: 50s - loss: 1.1608 - regression_loss: 1.0248 - classification_loss: 0.1361 353/500 [====================>.........] - ETA: 50s - loss: 1.1617 - regression_loss: 1.0257 - classification_loss: 0.1361 354/500 [====================>.........] - ETA: 49s - loss: 1.1648 - regression_loss: 1.0283 - classification_loss: 0.1365 355/500 [====================>.........] - ETA: 49s - loss: 1.1644 - regression_loss: 1.0280 - classification_loss: 0.1364 356/500 [====================>.........] 
- ETA: 49s - loss: 1.1650 - regression_loss: 1.0284 - classification_loss: 0.1366 357/500 [====================>.........] - ETA: 48s - loss: 1.1659 - regression_loss: 1.0291 - classification_loss: 0.1368 358/500 [====================>.........] - ETA: 48s - loss: 1.1662 - regression_loss: 1.0294 - classification_loss: 0.1369 359/500 [====================>.........] - ETA: 48s - loss: 1.1660 - regression_loss: 1.0292 - classification_loss: 0.1368 360/500 [====================>.........] - ETA: 47s - loss: 1.1660 - regression_loss: 1.0292 - classification_loss: 0.1368 361/500 [====================>.........] - ETA: 47s - loss: 1.1654 - regression_loss: 1.0287 - classification_loss: 0.1368 362/500 [====================>.........] - ETA: 47s - loss: 1.1661 - regression_loss: 1.0293 - classification_loss: 0.1368 363/500 [====================>.........] - ETA: 46s - loss: 1.1649 - regression_loss: 1.0280 - classification_loss: 0.1368 364/500 [====================>.........] - ETA: 46s - loss: 1.1639 - regression_loss: 1.0272 - classification_loss: 0.1367 365/500 [====================>.........] - ETA: 46s - loss: 1.1628 - regression_loss: 1.0263 - classification_loss: 0.1366 366/500 [====================>.........] - ETA: 45s - loss: 1.1647 - regression_loss: 1.0271 - classification_loss: 0.1375 367/500 [=====================>........] - ETA: 45s - loss: 1.1651 - regression_loss: 1.0273 - classification_loss: 0.1378 368/500 [=====================>........] - ETA: 45s - loss: 1.1637 - regression_loss: 1.0261 - classification_loss: 0.1376 369/500 [=====================>........] - ETA: 44s - loss: 1.1675 - regression_loss: 1.0293 - classification_loss: 0.1382 370/500 [=====================>........] - ETA: 44s - loss: 1.1688 - regression_loss: 1.0304 - classification_loss: 0.1384 371/500 [=====================>........] - ETA: 44s - loss: 1.1692 - regression_loss: 1.0309 - classification_loss: 0.1383 372/500 [=====================>........] 
- ETA: 43s - loss: 1.1692 - regression_loss: 1.0308 - classification_loss: 0.1383 373/500 [=====================>........] - ETA: 43s - loss: 1.1694 - regression_loss: 1.0309 - classification_loss: 0.1385 374/500 [=====================>........] - ETA: 43s - loss: 1.1702 - regression_loss: 1.0318 - classification_loss: 0.1384 375/500 [=====================>........] - ETA: 42s - loss: 1.1693 - regression_loss: 1.0310 - classification_loss: 0.1383 376/500 [=====================>........] - ETA: 42s - loss: 1.1692 - regression_loss: 1.0308 - classification_loss: 0.1384 377/500 [=====================>........] - ETA: 41s - loss: 1.1681 - regression_loss: 1.0298 - classification_loss: 0.1383 378/500 [=====================>........] - ETA: 41s - loss: 1.1671 - regression_loss: 1.0289 - classification_loss: 0.1382 379/500 [=====================>........] - ETA: 41s - loss: 1.1681 - regression_loss: 1.0298 - classification_loss: 0.1383 380/500 [=====================>........] - ETA: 40s - loss: 1.1663 - regression_loss: 1.0282 - classification_loss: 0.1381 381/500 [=====================>........] - ETA: 40s - loss: 1.1652 - regression_loss: 1.0273 - classification_loss: 0.1379 382/500 [=====================>........] - ETA: 40s - loss: 1.1641 - regression_loss: 1.0263 - classification_loss: 0.1377 383/500 [=====================>........] - ETA: 39s - loss: 1.1634 - regression_loss: 1.0259 - classification_loss: 0.1375 384/500 [======================>.......] - ETA: 39s - loss: 1.1646 - regression_loss: 1.0269 - classification_loss: 0.1377 385/500 [======================>.......] - ETA: 39s - loss: 1.1640 - regression_loss: 1.0264 - classification_loss: 0.1376 386/500 [======================>.......] - ETA: 38s - loss: 1.1643 - regression_loss: 1.0268 - classification_loss: 0.1375 387/500 [======================>.......] - ETA: 38s - loss: 1.1641 - regression_loss: 1.0267 - classification_loss: 0.1374 388/500 [======================>.......] 
- ETA: 38s - loss: 1.1636 - regression_loss: 1.0263 - classification_loss: 0.1373 389/500 [======================>.......] - ETA: 37s - loss: 1.1627 - regression_loss: 1.0255 - classification_loss: 0.1372 390/500 [======================>.......] - ETA: 37s - loss: 1.1637 - regression_loss: 1.0262 - classification_loss: 0.1375 391/500 [======================>.......] - ETA: 37s - loss: 1.1637 - regression_loss: 1.0261 - classification_loss: 0.1375 392/500 [======================>.......] - ETA: 36s - loss: 1.1643 - regression_loss: 1.0266 - classification_loss: 0.1377 393/500 [======================>.......] - ETA: 36s - loss: 1.1646 - regression_loss: 1.0270 - classification_loss: 0.1377 394/500 [======================>.......] - ETA: 36s - loss: 1.1660 - regression_loss: 1.0283 - classification_loss: 0.1377 395/500 [======================>.......] - ETA: 35s - loss: 1.1665 - regression_loss: 1.0287 - classification_loss: 0.1378 396/500 [======================>.......] - ETA: 35s - loss: 1.1675 - regression_loss: 1.0295 - classification_loss: 0.1380 397/500 [======================>.......] - ETA: 35s - loss: 1.1682 - regression_loss: 1.0303 - classification_loss: 0.1379 398/500 [======================>.......] - ETA: 34s - loss: 1.1681 - regression_loss: 1.0303 - classification_loss: 0.1378 399/500 [======================>.......] - ETA: 34s - loss: 1.1696 - regression_loss: 1.0314 - classification_loss: 0.1382 400/500 [=======================>......] - ETA: 34s - loss: 1.1685 - regression_loss: 1.0306 - classification_loss: 0.1379 401/500 [=======================>......] - ETA: 33s - loss: 1.1698 - regression_loss: 1.0315 - classification_loss: 0.1383 402/500 [=======================>......] - ETA: 33s - loss: 1.1704 - regression_loss: 1.0321 - classification_loss: 0.1383 403/500 [=======================>......] - ETA: 33s - loss: 1.1700 - regression_loss: 1.0318 - classification_loss: 0.1382 404/500 [=======================>......] 
- ETA: 32s - loss: 1.1694 - regression_loss: 1.0312 - classification_loss: 0.1382 405/500 [=======================>......] - ETA: 32s - loss: 1.1678 - regression_loss: 1.0298 - classification_loss: 0.1380 406/500 [=======================>......] - ETA: 32s - loss: 1.1660 - regression_loss: 1.0283 - classification_loss: 0.1377 407/500 [=======================>......] - ETA: 31s - loss: 1.1659 - regression_loss: 1.0283 - classification_loss: 0.1376 408/500 [=======================>......] - ETA: 31s - loss: 1.1654 - regression_loss: 1.0278 - classification_loss: 0.1376 409/500 [=======================>......] - ETA: 31s - loss: 1.1649 - regression_loss: 1.0274 - classification_loss: 0.1375 410/500 [=======================>......] - ETA: 30s - loss: 1.1643 - regression_loss: 1.0267 - classification_loss: 0.1376 411/500 [=======================>......] - ETA: 30s - loss: 1.1634 - regression_loss: 1.0259 - classification_loss: 0.1376 412/500 [=======================>......] - ETA: 30s - loss: 1.1646 - regression_loss: 1.0268 - classification_loss: 0.1378 413/500 [=======================>......] - ETA: 29s - loss: 1.1657 - regression_loss: 1.0277 - classification_loss: 0.1380 414/500 [=======================>......] - ETA: 29s - loss: 1.1672 - regression_loss: 1.0292 - classification_loss: 0.1380 415/500 [=======================>......] - ETA: 28s - loss: 1.1690 - regression_loss: 1.0308 - classification_loss: 0.1382 416/500 [=======================>......] - ETA: 28s - loss: 1.1693 - regression_loss: 1.0309 - classification_loss: 0.1384 417/500 [========================>.....] - ETA: 28s - loss: 1.1695 - regression_loss: 1.0311 - classification_loss: 0.1384 418/500 [========================>.....] - ETA: 27s - loss: 1.1707 - regression_loss: 1.0321 - classification_loss: 0.1386 419/500 [========================>.....] - ETA: 27s - loss: 1.1716 - regression_loss: 1.0329 - classification_loss: 0.1388 420/500 [========================>.....] 
- ETA: 27s - loss: 1.1699 - regression_loss: 1.0314 - classification_loss: 0.1385 421/500 [========================>.....] - ETA: 26s - loss: 1.1716 - regression_loss: 1.0328 - classification_loss: 0.1389 422/500 [========================>.....] - ETA: 26s - loss: 1.1736 - regression_loss: 1.0345 - classification_loss: 0.1392 423/500 [========================>.....] - ETA: 26s - loss: 1.1733 - regression_loss: 1.0342 - classification_loss: 0.1392 424/500 [========================>.....] - ETA: 25s - loss: 1.1745 - regression_loss: 1.0351 - classification_loss: 0.1394 425/500 [========================>.....] - ETA: 25s - loss: 1.1747 - regression_loss: 1.0352 - classification_loss: 0.1395 426/500 [========================>.....] - ETA: 25s - loss: 1.1756 - regression_loss: 1.0361 - classification_loss: 0.1395 427/500 [========================>.....] - ETA: 24s - loss: 1.1762 - regression_loss: 1.0366 - classification_loss: 0.1396 428/500 [========================>.....] - ETA: 24s - loss: 1.1759 - regression_loss: 1.0364 - classification_loss: 0.1395 429/500 [========================>.....] - ETA: 24s - loss: 1.1759 - regression_loss: 1.0364 - classification_loss: 0.1395 430/500 [========================>.....] - ETA: 23s - loss: 1.1767 - regression_loss: 1.0370 - classification_loss: 0.1396 431/500 [========================>.....] - ETA: 23s - loss: 1.1755 - regression_loss: 1.0361 - classification_loss: 0.1395 432/500 [========================>.....] - ETA: 23s - loss: 1.1767 - regression_loss: 1.0370 - classification_loss: 0.1396 433/500 [========================>.....] - ETA: 22s - loss: 1.1748 - regression_loss: 1.0354 - classification_loss: 0.1394 434/500 [=========================>....] - ETA: 22s - loss: 1.1748 - regression_loss: 1.0355 - classification_loss: 0.1393 435/500 [=========================>....] - ETA: 22s - loss: 1.1737 - regression_loss: 1.0346 - classification_loss: 0.1391 436/500 [=========================>....] 
- ETA: 21s - loss: 1.1741 - regression_loss: 1.0349 - classification_loss: 0.1392 437/500 [=========================>....] - ETA: 21s - loss: 1.1750 - regression_loss: 1.0355 - classification_loss: 0.1395 438/500 [=========================>....] - ETA: 21s - loss: 1.1752 - regression_loss: 1.0357 - classification_loss: 0.1395 439/500 [=========================>....] - ETA: 20s - loss: 1.1758 - regression_loss: 1.0363 - classification_loss: 0.1395 440/500 [=========================>....] - ETA: 20s - loss: 1.1760 - regression_loss: 1.0365 - classification_loss: 0.1395 441/500 [=========================>....] - ETA: 20s - loss: 1.1772 - regression_loss: 1.0376 - classification_loss: 0.1396 442/500 [=========================>....] - ETA: 19s - loss: 1.1754 - regression_loss: 1.0361 - classification_loss: 0.1393 443/500 [=========================>....] - ETA: 19s - loss: 1.1736 - regression_loss: 1.0345 - classification_loss: 0.1391 444/500 [=========================>....] - ETA: 19s - loss: 1.1731 - regression_loss: 1.0341 - classification_loss: 0.1390 445/500 [=========================>....] - ETA: 18s - loss: 1.1736 - regression_loss: 1.0345 - classification_loss: 0.1391 446/500 [=========================>....] - ETA: 18s - loss: 1.1725 - regression_loss: 1.0336 - classification_loss: 0.1389 447/500 [=========================>....] - ETA: 18s - loss: 1.1725 - regression_loss: 1.0336 - classification_loss: 0.1390 448/500 [=========================>....] - ETA: 17s - loss: 1.1712 - regression_loss: 1.0324 - classification_loss: 0.1388 449/500 [=========================>....] - ETA: 17s - loss: 1.1710 - regression_loss: 1.0323 - classification_loss: 0.1387 450/500 [==========================>...] - ETA: 17s - loss: 1.1706 - regression_loss: 1.0319 - classification_loss: 0.1387 451/500 [==========================>...] - ETA: 16s - loss: 1.1710 - regression_loss: 1.0324 - classification_loss: 0.1386 452/500 [==========================>...] 
- ETA: 16s - loss: 1.1696 - regression_loss: 1.0313 - classification_loss: 0.1384 453/500 [==========================>...] - ETA: 16s - loss: 1.1705 - regression_loss: 1.0322 - classification_loss: 0.1384 454/500 [==========================>...] - ETA: 15s - loss: 1.1707 - regression_loss: 1.0324 - classification_loss: 0.1384 455/500 [==========================>...] - ETA: 15s - loss: 1.1709 - regression_loss: 1.0325 - classification_loss: 0.1384 456/500 [==========================>...] - ETA: 14s - loss: 1.1711 - regression_loss: 1.0326 - classification_loss: 0.1385 457/500 [==========================>...] - ETA: 14s - loss: 1.1710 - regression_loss: 1.0325 - classification_loss: 0.1385 458/500 [==========================>...] - ETA: 14s - loss: 1.1715 - regression_loss: 1.0328 - classification_loss: 0.1387 459/500 [==========================>...] - ETA: 13s - loss: 1.1722 - regression_loss: 1.0335 - classification_loss: 0.1387 460/500 [==========================>...] - ETA: 13s - loss: 1.1720 - regression_loss: 1.0333 - classification_loss: 0.1386 461/500 [==========================>...] - ETA: 13s - loss: 1.1720 - regression_loss: 1.0333 - classification_loss: 0.1387 462/500 [==========================>...] - ETA: 12s - loss: 1.1719 - regression_loss: 1.0332 - classification_loss: 0.1387 463/500 [==========================>...] - ETA: 12s - loss: 1.1724 - regression_loss: 1.0337 - classification_loss: 0.1387 464/500 [==========================>...] - ETA: 12s - loss: 1.1722 - regression_loss: 1.0336 - classification_loss: 0.1386 465/500 [==========================>...] - ETA: 11s - loss: 1.1724 - regression_loss: 1.0339 - classification_loss: 0.1386 466/500 [==========================>...] - ETA: 11s - loss: 1.1734 - regression_loss: 1.0346 - classification_loss: 0.1388 467/500 [===========================>..] - ETA: 11s - loss: 1.1731 - regression_loss: 1.0343 - classification_loss: 0.1388 468/500 [===========================>..] 
- ETA: 10s - loss: 1.1732 - regression_loss: 1.0344 - classification_loss: 0.1389 469/500 [===========================>..] - ETA: 10s - loss: 1.1731 - regression_loss: 1.0343 - classification_loss: 0.1388 470/500 [===========================>..] - ETA: 10s - loss: 1.1745 - regression_loss: 1.0356 - classification_loss: 0.1389 471/500 [===========================>..] - ETA: 9s - loss: 1.1748 - regression_loss: 1.0358 - classification_loss: 0.1390  472/500 [===========================>..] - ETA: 9s - loss: 1.1750 - regression_loss: 1.0360 - classification_loss: 0.1390 473/500 [===========================>..] - ETA: 9s - loss: 1.1756 - regression_loss: 1.0366 - classification_loss: 0.1390 474/500 [===========================>..] - ETA: 8s - loss: 1.1763 - regression_loss: 1.0372 - classification_loss: 0.1391 475/500 [===========================>..] - ETA: 8s - loss: 1.1761 - regression_loss: 1.0370 - classification_loss: 0.1391 476/500 [===========================>..] - ETA: 8s - loss: 1.1748 - regression_loss: 1.0359 - classification_loss: 0.1389 477/500 [===========================>..] - ETA: 7s - loss: 1.1739 - regression_loss: 1.0351 - classification_loss: 0.1388 478/500 [===========================>..] - ETA: 7s - loss: 1.1749 - regression_loss: 1.0360 - classification_loss: 0.1389 479/500 [===========================>..] - ETA: 7s - loss: 1.1744 - regression_loss: 1.0356 - classification_loss: 0.1388 480/500 [===========================>..] - ETA: 6s - loss: 1.1745 - regression_loss: 1.0357 - classification_loss: 0.1388 481/500 [===========================>..] - ETA: 6s - loss: 1.1740 - regression_loss: 1.0352 - classification_loss: 0.1388 482/500 [===========================>..] - ETA: 6s - loss: 1.1744 - regression_loss: 1.0355 - classification_loss: 0.1389 483/500 [===========================>..] - ETA: 5s - loss: 1.1737 - regression_loss: 1.0349 - classification_loss: 0.1388 484/500 [============================>.] 
- ETA: 5s - loss: 1.1726 - regression_loss: 1.0340 - classification_loss: 0.1386 485/500 [============================>.] - ETA: 5s - loss: 1.1718 - regression_loss: 1.0334 - classification_loss: 0.1384 486/500 [============================>.] - ETA: 4s - loss: 1.1718 - regression_loss: 1.0334 - classification_loss: 0.1383 487/500 [============================>.] - ETA: 4s - loss: 1.1713 - regression_loss: 1.0331 - classification_loss: 0.1382 488/500 [============================>.] - ETA: 4s - loss: 1.1715 - regression_loss: 1.0333 - classification_loss: 0.1382 489/500 [============================>.] - ETA: 3s - loss: 1.1714 - regression_loss: 1.0329 - classification_loss: 0.1385 490/500 [============================>.] - ETA: 3s - loss: 1.1716 - regression_loss: 1.0331 - classification_loss: 0.1385 491/500 [============================>.] - ETA: 3s - loss: 1.1719 - regression_loss: 1.0333 - classification_loss: 0.1386 492/500 [============================>.] - ETA: 2s - loss: 1.1709 - regression_loss: 1.0325 - classification_loss: 0.1385 493/500 [============================>.] - ETA: 2s - loss: 1.1702 - regression_loss: 1.0318 - classification_loss: 0.1384 494/500 [============================>.] - ETA: 2s - loss: 1.1703 - regression_loss: 1.0319 - classification_loss: 0.1384 495/500 [============================>.] - ETA: 1s - loss: 1.1711 - regression_loss: 1.0324 - classification_loss: 0.1386 496/500 [============================>.] - ETA: 1s - loss: 1.1712 - regression_loss: 1.0325 - classification_loss: 0.1386 497/500 [============================>.] - ETA: 1s - loss: 1.1716 - regression_loss: 1.0328 - classification_loss: 0.1387 498/500 [============================>.] - ETA: 0s - loss: 1.1732 - regression_loss: 1.0342 - classification_loss: 0.1390 499/500 [============================>.] 
- ETA: 0s - loss: 1.1723 - regression_loss: 1.0334 - classification_loss: 0.1389 500/500 [==============================] - 170s 341ms/step - loss: 1.1726 - regression_loss: 1.0337 - classification_loss: 0.1390
1172 instances of class plum with average precision: 0.7787
mAP: 0.7787
Epoch 00021: saving model to ./training/snapshots/resnet101_pascal_21.h5
Epoch 22/150
1/500 [..............................] - ETA: 2:45 - loss: 0.4537 - regression_loss: 0.3637 - classification_loss: 0.0900 2/500 [..............................] - ETA: 2:46 - loss: 0.5218 - regression_loss: 0.4522 - classification_loss: 0.0696 3/500 [..............................] - ETA: 2:48 - loss: 0.8396 - regression_loss: 0.7409 - classification_loss: 0.0987 4/500 [..............................] - ETA: 2:48 - loss: 0.9961 - regression_loss: 0.8791 - classification_loss: 0.1169 5/500 [..............................] - ETA: 2:48 - loss: 1.0681 - regression_loss: 0.9452 - classification_loss: 0.1229 6/500 [..............................] - ETA: 2:46 - loss: 1.2066 - regression_loss: 1.0660 - classification_loss: 0.1406 7/500 [..............................] - ETA: 2:46 - loss: 1.2893 - regression_loss: 1.1359 - classification_loss: 0.1534 8/500 [..............................] - ETA: 2:46 - loss: 1.2796 - regression_loss: 1.1275 - classification_loss: 0.1521 9/500 [..............................] - ETA: 2:46 - loss: 1.1705 - regression_loss: 1.0318 - classification_loss: 0.1387 10/500 [..............................] - ETA: 2:45 - loss: 1.1425 - regression_loss: 1.0132 - classification_loss: 0.1293 11/500 [..............................] - ETA: 2:45 - loss: 1.1640 - regression_loss: 1.0319 - classification_loss: 0.1322 12/500 [..............................] - ETA: 2:45 - loss: 1.1712 - regression_loss: 1.0394 - classification_loss: 0.1319 13/500 [..............................] - ETA: 2:45 - loss: 1.1773 - regression_loss: 1.0440 - classification_loss: 0.1333 14/500 [..............................]
Epoch progress, steps 15-349 of 500 (condensed from flattened carriage-return progress-bar updates; every 25th step shown):
 15/500 - ETA: 2:44 - loss: 1.2008 - regression_loss: 1.0652 - classification_loss: 0.1356
 25/500 - ETA: 2:40 - loss: 1.0946 - regression_loss: 0.9666 - classification_loss: 0.1280
 50/500 - ETA: 2:31 - loss: 1.1472 - regression_loss: 1.0081 - classification_loss: 0.1391
 75/500 - ETA: 2:22 - loss: 1.1554 - regression_loss: 1.0035 - classification_loss: 0.1518
100/500 - ETA: 2:14 - loss: 1.1637 - regression_loss: 1.0147 - classification_loss: 0.1490
125/500 - ETA: 2:06 - loss: 1.1640 - regression_loss: 1.0173 - classification_loss: 0.1467
150/500 - ETA: 1:58 - loss: 1.1635 - regression_loss: 1.0173 - classification_loss: 0.1462
175/500 - ETA: 1:50 - loss: 1.1437 - regression_loss: 1.0008 - classification_loss: 0.1430
200/500 - ETA: 1:41 - loss: 1.1312 - regression_loss: 0.9918 - classification_loss: 0.1394
225/500 - ETA: 1:33 - loss: 1.1351 - regression_loss: 0.9976 - classification_loss: 0.1375
250/500 - ETA: 1:24 - loss: 1.1425 - regression_loss: 1.0041 - classification_loss: 0.1384
275/500 - ETA: 1:16 - loss: 1.1379 - regression_loss: 1.0014 - classification_loss: 0.1365
300/500 - ETA: 1:07 - loss: 1.1350 - regression_loss: 0.9997 - classification_loss: 0.1353
325/500 - ETA: 59s - loss: 1.1449 - regression_loss: 1.0067 - classification_loss: 0.1382
349/500 - ETA: 51s - loss: 1.1400 - regression_loss: 1.0030 - classification_loss: 0.1370
- ETA: 50s - loss: 1.1396 - regression_loss: 1.0027 - classification_loss: 0.1370 351/500 [====================>.........] - ETA: 50s - loss: 1.1405 - regression_loss: 1.0034 - classification_loss: 0.1371 352/500 [====================>.........] - ETA: 50s - loss: 1.1417 - regression_loss: 1.0046 - classification_loss: 0.1371 353/500 [====================>.........] - ETA: 49s - loss: 1.1411 - regression_loss: 1.0041 - classification_loss: 0.1370 354/500 [====================>.........] - ETA: 49s - loss: 1.1416 - regression_loss: 1.0046 - classification_loss: 0.1370 355/500 [====================>.........] - ETA: 49s - loss: 1.1406 - regression_loss: 1.0038 - classification_loss: 0.1369 356/500 [====================>.........] - ETA: 48s - loss: 1.1432 - regression_loss: 1.0061 - classification_loss: 0.1371 357/500 [====================>.........] - ETA: 48s - loss: 1.1430 - regression_loss: 1.0060 - classification_loss: 0.1370 358/500 [====================>.........] - ETA: 48s - loss: 1.1427 - regression_loss: 1.0057 - classification_loss: 0.1370 359/500 [====================>.........] - ETA: 47s - loss: 1.1431 - regression_loss: 1.0060 - classification_loss: 0.1371 360/500 [====================>.........] - ETA: 47s - loss: 1.1439 - regression_loss: 1.0067 - classification_loss: 0.1372 361/500 [====================>.........] - ETA: 47s - loss: 1.1425 - regression_loss: 1.0056 - classification_loss: 0.1369 362/500 [====================>.........] - ETA: 46s - loss: 1.1428 - regression_loss: 1.0059 - classification_loss: 0.1369 363/500 [====================>.........] - ETA: 46s - loss: 1.1432 - regression_loss: 1.0064 - classification_loss: 0.1368 364/500 [====================>.........] - ETA: 46s - loss: 1.1419 - regression_loss: 1.0052 - classification_loss: 0.1366 365/500 [====================>.........] - ETA: 45s - loss: 1.1418 - regression_loss: 1.0052 - classification_loss: 0.1366 366/500 [====================>.........] 
- ETA: 45s - loss: 1.1426 - regression_loss: 1.0059 - classification_loss: 0.1367 367/500 [=====================>........] - ETA: 45s - loss: 1.1419 - regression_loss: 1.0054 - classification_loss: 0.1365 368/500 [=====================>........] - ETA: 44s - loss: 1.1417 - regression_loss: 1.0050 - classification_loss: 0.1367 369/500 [=====================>........] - ETA: 44s - loss: 1.1400 - regression_loss: 1.0036 - classification_loss: 0.1364 370/500 [=====================>........] - ETA: 44s - loss: 1.1404 - regression_loss: 1.0041 - classification_loss: 0.1363 371/500 [=====================>........] - ETA: 43s - loss: 1.1411 - regression_loss: 1.0046 - classification_loss: 0.1365 372/500 [=====================>........] - ETA: 43s - loss: 1.1399 - regression_loss: 1.0036 - classification_loss: 0.1362 373/500 [=====================>........] - ETA: 43s - loss: 1.1385 - regression_loss: 1.0025 - classification_loss: 0.1360 374/500 [=====================>........] - ETA: 42s - loss: 1.1387 - regression_loss: 1.0026 - classification_loss: 0.1361 375/500 [=====================>........] - ETA: 42s - loss: 1.1378 - regression_loss: 1.0019 - classification_loss: 0.1359 376/500 [=====================>........] - ETA: 42s - loss: 1.1376 - regression_loss: 1.0018 - classification_loss: 0.1359 377/500 [=====================>........] - ETA: 41s - loss: 1.1375 - regression_loss: 1.0016 - classification_loss: 0.1359 378/500 [=====================>........] - ETA: 41s - loss: 1.1364 - regression_loss: 1.0007 - classification_loss: 0.1357 379/500 [=====================>........] - ETA: 41s - loss: 1.1367 - regression_loss: 1.0008 - classification_loss: 0.1360 380/500 [=====================>........] - ETA: 40s - loss: 1.1370 - regression_loss: 1.0009 - classification_loss: 0.1360 381/500 [=====================>........] - ETA: 40s - loss: 1.1369 - regression_loss: 1.0010 - classification_loss: 0.1359 382/500 [=====================>........] 
- ETA: 40s - loss: 1.1371 - regression_loss: 1.0012 - classification_loss: 0.1359 383/500 [=====================>........] - ETA: 39s - loss: 1.1388 - regression_loss: 1.0027 - classification_loss: 0.1361 384/500 [======================>.......] - ETA: 39s - loss: 1.1392 - regression_loss: 1.0031 - classification_loss: 0.1361 385/500 [======================>.......] - ETA: 38s - loss: 1.1409 - regression_loss: 1.0046 - classification_loss: 0.1364 386/500 [======================>.......] - ETA: 38s - loss: 1.1417 - regression_loss: 1.0053 - classification_loss: 0.1365 387/500 [======================>.......] - ETA: 38s - loss: 1.1407 - regression_loss: 1.0045 - classification_loss: 0.1363 388/500 [======================>.......] - ETA: 37s - loss: 1.1414 - regression_loss: 1.0050 - classification_loss: 0.1365 389/500 [======================>.......] - ETA: 37s - loss: 1.1416 - regression_loss: 1.0051 - classification_loss: 0.1365 390/500 [======================>.......] - ETA: 37s - loss: 1.1415 - regression_loss: 1.0051 - classification_loss: 0.1365 391/500 [======================>.......] - ETA: 36s - loss: 1.1428 - regression_loss: 1.0063 - classification_loss: 0.1365 392/500 [======================>.......] - ETA: 36s - loss: 1.1433 - regression_loss: 1.0067 - classification_loss: 0.1366 393/500 [======================>.......] - ETA: 36s - loss: 1.1443 - regression_loss: 1.0074 - classification_loss: 0.1368 394/500 [======================>.......] - ETA: 35s - loss: 1.1445 - regression_loss: 1.0076 - classification_loss: 0.1369 395/500 [======================>.......] - ETA: 35s - loss: 1.1442 - regression_loss: 1.0073 - classification_loss: 0.1369 396/500 [======================>.......] - ETA: 35s - loss: 1.1441 - regression_loss: 1.0072 - classification_loss: 0.1369 397/500 [======================>.......] - ETA: 34s - loss: 1.1440 - regression_loss: 1.0071 - classification_loss: 0.1370 398/500 [======================>.......] 
- ETA: 34s - loss: 1.1437 - regression_loss: 1.0068 - classification_loss: 0.1369 399/500 [======================>.......] - ETA: 34s - loss: 1.1432 - regression_loss: 1.0064 - classification_loss: 0.1368 400/500 [=======================>......] - ETA: 33s - loss: 1.1436 - regression_loss: 1.0067 - classification_loss: 0.1369 401/500 [=======================>......] - ETA: 33s - loss: 1.1438 - regression_loss: 1.0068 - classification_loss: 0.1370 402/500 [=======================>......] - ETA: 33s - loss: 1.1446 - regression_loss: 1.0074 - classification_loss: 0.1371 403/500 [=======================>......] - ETA: 32s - loss: 1.1435 - regression_loss: 1.0066 - classification_loss: 0.1369 404/500 [=======================>......] - ETA: 32s - loss: 1.1441 - regression_loss: 1.0070 - classification_loss: 0.1371 405/500 [=======================>......] - ETA: 32s - loss: 1.1438 - regression_loss: 1.0068 - classification_loss: 0.1369 406/500 [=======================>......] - ETA: 31s - loss: 1.1443 - regression_loss: 1.0074 - classification_loss: 0.1370 407/500 [=======================>......] - ETA: 31s - loss: 1.1439 - regression_loss: 1.0070 - classification_loss: 0.1369 408/500 [=======================>......] - ETA: 31s - loss: 1.1441 - regression_loss: 1.0072 - classification_loss: 0.1370 409/500 [=======================>......] - ETA: 30s - loss: 1.1447 - regression_loss: 1.0077 - classification_loss: 0.1370 410/500 [=======================>......] - ETA: 30s - loss: 1.1449 - regression_loss: 1.0078 - classification_loss: 0.1371 411/500 [=======================>......] - ETA: 30s - loss: 1.1459 - regression_loss: 1.0086 - classification_loss: 0.1373 412/500 [=======================>......] - ETA: 29s - loss: 1.1459 - regression_loss: 1.0086 - classification_loss: 0.1373 413/500 [=======================>......] - ETA: 29s - loss: 1.1453 - regression_loss: 1.0080 - classification_loss: 0.1374 414/500 [=======================>......] 
- ETA: 29s - loss: 1.1458 - regression_loss: 1.0083 - classification_loss: 0.1375 415/500 [=======================>......] - ETA: 28s - loss: 1.1441 - regression_loss: 1.0068 - classification_loss: 0.1373 416/500 [=======================>......] - ETA: 28s - loss: 1.1425 - regression_loss: 1.0054 - classification_loss: 0.1371 417/500 [========================>.....] - ETA: 28s - loss: 1.1433 - regression_loss: 1.0061 - classification_loss: 0.1373 418/500 [========================>.....] - ETA: 27s - loss: 1.1426 - regression_loss: 1.0055 - classification_loss: 0.1371 419/500 [========================>.....] - ETA: 27s - loss: 1.1412 - regression_loss: 1.0043 - classification_loss: 0.1369 420/500 [========================>.....] - ETA: 27s - loss: 1.1412 - regression_loss: 1.0043 - classification_loss: 0.1369 421/500 [========================>.....] - ETA: 26s - loss: 1.1408 - regression_loss: 1.0041 - classification_loss: 0.1367 422/500 [========================>.....] - ETA: 26s - loss: 1.1416 - regression_loss: 1.0048 - classification_loss: 0.1368 423/500 [========================>.....] - ETA: 26s - loss: 1.1415 - regression_loss: 1.0046 - classification_loss: 0.1368 424/500 [========================>.....] - ETA: 25s - loss: 1.1412 - regression_loss: 1.0043 - classification_loss: 0.1369 425/500 [========================>.....] - ETA: 25s - loss: 1.1412 - regression_loss: 1.0042 - classification_loss: 0.1370 426/500 [========================>.....] - ETA: 25s - loss: 1.1417 - regression_loss: 1.0047 - classification_loss: 0.1370 427/500 [========================>.....] - ETA: 24s - loss: 1.1416 - regression_loss: 1.0045 - classification_loss: 0.1370 428/500 [========================>.....] - ETA: 24s - loss: 1.1413 - regression_loss: 1.0044 - classification_loss: 0.1369 429/500 [========================>.....] - ETA: 24s - loss: 1.1399 - regression_loss: 1.0033 - classification_loss: 0.1367 430/500 [========================>.....] 
- ETA: 23s - loss: 1.1389 - regression_loss: 1.0023 - classification_loss: 0.1366 431/500 [========================>.....] - ETA: 23s - loss: 1.1379 - regression_loss: 1.0015 - classification_loss: 0.1364 432/500 [========================>.....] - ETA: 23s - loss: 1.1366 - regression_loss: 1.0004 - classification_loss: 0.1362 433/500 [========================>.....] - ETA: 22s - loss: 1.1352 - regression_loss: 0.9992 - classification_loss: 0.1360 434/500 [=========================>....] - ETA: 22s - loss: 1.1356 - regression_loss: 0.9995 - classification_loss: 0.1361 435/500 [=========================>....] - ETA: 22s - loss: 1.1353 - regression_loss: 0.9993 - classification_loss: 0.1360 436/500 [=========================>....] - ETA: 21s - loss: 1.1340 - regression_loss: 0.9982 - classification_loss: 0.1358 437/500 [=========================>....] - ETA: 21s - loss: 1.1339 - regression_loss: 0.9981 - classification_loss: 0.1358 438/500 [=========================>....] - ETA: 21s - loss: 1.1327 - regression_loss: 0.9972 - classification_loss: 0.1355 439/500 [=========================>....] - ETA: 20s - loss: 1.1324 - regression_loss: 0.9970 - classification_loss: 0.1355 440/500 [=========================>....] - ETA: 20s - loss: 1.1311 - regression_loss: 0.9958 - classification_loss: 0.1353 441/500 [=========================>....] - ETA: 19s - loss: 1.1325 - regression_loss: 0.9968 - classification_loss: 0.1356 442/500 [=========================>....] - ETA: 19s - loss: 1.1330 - regression_loss: 0.9975 - classification_loss: 0.1356 443/500 [=========================>....] - ETA: 19s - loss: 1.1336 - regression_loss: 0.9980 - classification_loss: 0.1355 444/500 [=========================>....] - ETA: 18s - loss: 1.1333 - regression_loss: 0.9978 - classification_loss: 0.1355 445/500 [=========================>....] - ETA: 18s - loss: 1.1333 - regression_loss: 0.9978 - classification_loss: 0.1355 446/500 [=========================>....] 
- ETA: 18s - loss: 1.1323 - regression_loss: 0.9970 - classification_loss: 0.1353 447/500 [=========================>....] - ETA: 17s - loss: 1.1330 - regression_loss: 0.9976 - classification_loss: 0.1353 448/500 [=========================>....] - ETA: 17s - loss: 1.1339 - regression_loss: 0.9984 - classification_loss: 0.1355 449/500 [=========================>....] - ETA: 17s - loss: 1.1328 - regression_loss: 0.9975 - classification_loss: 0.1353 450/500 [==========================>...] - ETA: 16s - loss: 1.1323 - regression_loss: 0.9971 - classification_loss: 0.1353 451/500 [==========================>...] - ETA: 16s - loss: 1.1324 - regression_loss: 0.9972 - classification_loss: 0.1352 452/500 [==========================>...] - ETA: 16s - loss: 1.1311 - regression_loss: 0.9961 - classification_loss: 0.1350 453/500 [==========================>...] - ETA: 15s - loss: 1.1315 - regression_loss: 0.9965 - classification_loss: 0.1350 454/500 [==========================>...] - ETA: 15s - loss: 1.1316 - regression_loss: 0.9965 - classification_loss: 0.1351 455/500 [==========================>...] - ETA: 15s - loss: 1.1308 - regression_loss: 0.9959 - classification_loss: 0.1349 456/500 [==========================>...] - ETA: 14s - loss: 1.1308 - regression_loss: 0.9960 - classification_loss: 0.1348 457/500 [==========================>...] - ETA: 14s - loss: 1.1290 - regression_loss: 0.9944 - classification_loss: 0.1346 458/500 [==========================>...] - ETA: 14s - loss: 1.1279 - regression_loss: 0.9934 - classification_loss: 0.1344 459/500 [==========================>...] - ETA: 13s - loss: 1.1263 - regression_loss: 0.9920 - classification_loss: 0.1342 460/500 [==========================>...] - ETA: 13s - loss: 1.1271 - regression_loss: 0.9928 - classification_loss: 0.1343 461/500 [==========================>...] - ETA: 13s - loss: 1.1269 - regression_loss: 0.9926 - classification_loss: 0.1343 462/500 [==========================>...] 
- ETA: 12s - loss: 1.1264 - regression_loss: 0.9922 - classification_loss: 0.1342 463/500 [==========================>...] - ETA: 12s - loss: 1.1256 - regression_loss: 0.9915 - classification_loss: 0.1341 464/500 [==========================>...] - ETA: 12s - loss: 1.1255 - regression_loss: 0.9914 - classification_loss: 0.1341 465/500 [==========================>...] - ETA: 11s - loss: 1.1257 - regression_loss: 0.9917 - classification_loss: 0.1340 466/500 [==========================>...] - ETA: 11s - loss: 1.1245 - regression_loss: 0.9908 - classification_loss: 0.1338 467/500 [===========================>..] - ETA: 11s - loss: 1.1246 - regression_loss: 0.9908 - classification_loss: 0.1338 468/500 [===========================>..] - ETA: 10s - loss: 1.1245 - regression_loss: 0.9907 - classification_loss: 0.1339 469/500 [===========================>..] - ETA: 10s - loss: 1.1243 - regression_loss: 0.9904 - classification_loss: 0.1338 470/500 [===========================>..] - ETA: 10s - loss: 1.1241 - regression_loss: 0.9904 - classification_loss: 0.1338 471/500 [===========================>..] - ETA: 9s - loss: 1.1235 - regression_loss: 0.9899 - classification_loss: 0.1336  472/500 [===========================>..] - ETA: 9s - loss: 1.1238 - regression_loss: 0.9901 - classification_loss: 0.1337 473/500 [===========================>..] - ETA: 9s - loss: 1.1242 - regression_loss: 0.9904 - classification_loss: 0.1337 474/500 [===========================>..] - ETA: 8s - loss: 1.1246 - regression_loss: 0.9910 - classification_loss: 0.1336 475/500 [===========================>..] - ETA: 8s - loss: 1.1240 - regression_loss: 0.9905 - classification_loss: 0.1335 476/500 [===========================>..] - ETA: 8s - loss: 1.1230 - regression_loss: 0.9897 - classification_loss: 0.1334 477/500 [===========================>..] - ETA: 7s - loss: 1.1229 - regression_loss: 0.9895 - classification_loss: 0.1333 478/500 [===========================>..] 
- ETA: 7s - loss: 1.1223 - regression_loss: 0.9892 - classification_loss: 0.1332 479/500 [===========================>..] - ETA: 7s - loss: 1.1236 - regression_loss: 0.9903 - classification_loss: 0.1334 480/500 [===========================>..] - ETA: 6s - loss: 1.1233 - regression_loss: 0.9899 - classification_loss: 0.1334 481/500 [===========================>..] - ETA: 6s - loss: 1.1229 - regression_loss: 0.9897 - classification_loss: 0.1333 482/500 [===========================>..] - ETA: 6s - loss: 1.1235 - regression_loss: 0.9902 - classification_loss: 0.1333 483/500 [===========================>..] - ETA: 5s - loss: 1.1235 - regression_loss: 0.9902 - classification_loss: 0.1333 484/500 [============================>.] - ETA: 5s - loss: 1.1227 - regression_loss: 0.9896 - classification_loss: 0.1331 485/500 [============================>.] - ETA: 5s - loss: 1.1227 - regression_loss: 0.9896 - classification_loss: 0.1331 486/500 [============================>.] - ETA: 4s - loss: 1.1231 - regression_loss: 0.9901 - classification_loss: 0.1331 487/500 [============================>.] - ETA: 4s - loss: 1.1232 - regression_loss: 0.9902 - classification_loss: 0.1330 488/500 [============================>.] - ETA: 4s - loss: 1.1245 - regression_loss: 0.9913 - classification_loss: 0.1331 489/500 [============================>.] - ETA: 3s - loss: 1.1238 - regression_loss: 0.9908 - classification_loss: 0.1330 490/500 [============================>.] - ETA: 3s - loss: 1.1230 - regression_loss: 0.9901 - classification_loss: 0.1329 491/500 [============================>.] - ETA: 3s - loss: 1.1244 - regression_loss: 0.9914 - classification_loss: 0.1330 492/500 [============================>.] - ETA: 2s - loss: 1.1257 - regression_loss: 0.9925 - classification_loss: 0.1333 493/500 [============================>.] - ETA: 2s - loss: 1.1267 - regression_loss: 0.9933 - classification_loss: 0.1334 494/500 [============================>.] 
(remaining per-batch progress output for epoch 22 elided)
500/500 [==============================] - 170s 339ms/step - loss: 1.1275 - regression_loss: 0.9940 - classification_loss: 0.1335
1172 instances of class plum with average precision: 0.7595
mAP: 0.7595
Epoch 00022: saving model to ./training/snapshots/resnet101_pascal_22.h5
Epoch 23/150
(per-batch progress output for epoch 23, steps 1-9, elided)
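Epoch summaries like the one above (final-step losses, per-class AP, mAP, snapshot path) can be recovered from a captured console log with a small script. A minimal sketch, assuming the log was saved to a text file and that the line formats match the Keras/keras-retinanet output shown here; the regexes and helper name are illustrative, not part of the original run:

```python
import re

# Matches the Keras end-of-epoch summary line, e.g.
# "500/500 [====] - 170s 339ms/step - loss: 1.1275 - regression_loss: 0.9940 - classification_loss: 0.1335"
EPOCH_END = re.compile(
    r"(\d+)/\1 \[=+\] - \d+s .*?loss: ([\d.]+) - "
    r"regression_loss: ([\d.]+) - classification_loss: ([\d.]+)"
)
# Matches the keras-retinanet evaluation line, e.g. "mAP: 0.7595"
MAP_LINE = re.compile(r"mAP: ([\d.]+)")

def parse_epoch_summaries(text):
    """Return one (loss, regression_loss, classification_loss, mAP) tuple
    per completed epoch found in the log text."""
    losses = [(float(m.group(2)), float(m.group(3)), float(m.group(4)))
              for m in EPOCH_END.finditer(text)]
    maps = [float(m.group(1)) for m in MAP_LINE.finditer(text)]
    return [l + (m,) for l, m in zip(losses, maps)]

sample = ("500/500 [==============================] - 170s 339ms/step - "
          "loss: 1.1275 - regression_loss: 0.9940 - classification_loss: 0.1335\n"
          "1172 instances of class plum with average precision: 0.7595\n"
          "mAP: 0.7595\n")
print(parse_epoch_summaries(sample))
```

Running this over the full log yields one tuple per epoch, which is convenient for plotting the loss and mAP curves across all 150 epochs.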
(per-batch progress output for epoch 23, steps 10-121, elided; running loss settled near 1.17)
- ETA: 2:08 - loss: 1.1711 - regression_loss: 1.0355 - classification_loss: 0.1356 122/500 [======>.......................] - ETA: 2:08 - loss: 1.1688 - regression_loss: 1.0335 - classification_loss: 0.1353 123/500 [======>.......................] - ETA: 2:07 - loss: 1.1723 - regression_loss: 1.0365 - classification_loss: 0.1358 124/500 [======>.......................] - ETA: 2:07 - loss: 1.1769 - regression_loss: 1.0401 - classification_loss: 0.1367 125/500 [======>.......................] - ETA: 2:07 - loss: 1.1748 - regression_loss: 1.0383 - classification_loss: 0.1365 126/500 [======>.......................] - ETA: 2:06 - loss: 1.1765 - regression_loss: 1.0393 - classification_loss: 0.1372 127/500 [======>.......................] - ETA: 2:06 - loss: 1.1753 - regression_loss: 1.0386 - classification_loss: 0.1367 128/500 [======>.......................] - ETA: 2:06 - loss: 1.1762 - regression_loss: 1.0394 - classification_loss: 0.1368 129/500 [======>.......................] - ETA: 2:05 - loss: 1.1876 - regression_loss: 1.0424 - classification_loss: 0.1451 130/500 [======>.......................] - ETA: 2:05 - loss: 1.1891 - regression_loss: 1.0438 - classification_loss: 0.1453 131/500 [======>.......................] - ETA: 2:05 - loss: 1.1888 - regression_loss: 1.0436 - classification_loss: 0.1451 132/500 [======>.......................] - ETA: 2:04 - loss: 1.1958 - regression_loss: 1.0488 - classification_loss: 0.1471 133/500 [======>.......................] - ETA: 2:04 - loss: 1.1949 - regression_loss: 1.0482 - classification_loss: 0.1467 134/500 [=======>......................] - ETA: 2:04 - loss: 1.1954 - regression_loss: 1.0487 - classification_loss: 0.1467 135/500 [=======>......................] - ETA: 2:03 - loss: 1.1947 - regression_loss: 1.0484 - classification_loss: 0.1463 136/500 [=======>......................] - ETA: 2:03 - loss: 1.1892 - regression_loss: 1.0437 - classification_loss: 0.1455 137/500 [=======>......................] 
- ETA: 2:03 - loss: 1.1891 - regression_loss: 1.0440 - classification_loss: 0.1452 138/500 [=======>......................] - ETA: 2:02 - loss: 1.1878 - regression_loss: 1.0429 - classification_loss: 0.1448 139/500 [=======>......................] - ETA: 2:02 - loss: 1.1852 - regression_loss: 1.0409 - classification_loss: 0.1443 140/500 [=======>......................] - ETA: 2:02 - loss: 1.1848 - regression_loss: 1.0408 - classification_loss: 0.1440 141/500 [=======>......................] - ETA: 2:01 - loss: 1.1835 - regression_loss: 1.0398 - classification_loss: 0.1437 142/500 [=======>......................] - ETA: 2:01 - loss: 1.1783 - regression_loss: 1.0354 - classification_loss: 0.1429 143/500 [=======>......................] - ETA: 2:01 - loss: 1.1797 - regression_loss: 1.0364 - classification_loss: 0.1433 144/500 [=======>......................] - ETA: 2:00 - loss: 1.1775 - regression_loss: 1.0345 - classification_loss: 0.1430 145/500 [=======>......................] - ETA: 2:00 - loss: 1.1780 - regression_loss: 1.0351 - classification_loss: 0.1429 146/500 [=======>......................] - ETA: 2:00 - loss: 1.1752 - regression_loss: 1.0327 - classification_loss: 0.1425 147/500 [=======>......................] - ETA: 1:59 - loss: 1.1740 - regression_loss: 1.0318 - classification_loss: 0.1422 148/500 [=======>......................] - ETA: 1:59 - loss: 1.1676 - regression_loss: 1.0262 - classification_loss: 0.1414 149/500 [=======>......................] - ETA: 1:59 - loss: 1.1690 - regression_loss: 1.0275 - classification_loss: 0.1415 150/500 [========>.....................] - ETA: 1:58 - loss: 1.1663 - regression_loss: 1.0254 - classification_loss: 0.1409 151/500 [========>.....................] - ETA: 1:58 - loss: 1.1618 - regression_loss: 1.0216 - classification_loss: 0.1402 152/500 [========>.....................] - ETA: 1:58 - loss: 1.1608 - regression_loss: 1.0208 - classification_loss: 0.1400 153/500 [========>.....................] 
- ETA: 1:57 - loss: 1.1560 - regression_loss: 1.0167 - classification_loss: 0.1393 154/500 [========>.....................] - ETA: 1:57 - loss: 1.1514 - regression_loss: 1.0126 - classification_loss: 0.1387 155/500 [========>.....................] - ETA: 1:57 - loss: 1.1503 - regression_loss: 1.0119 - classification_loss: 0.1384 156/500 [========>.....................] - ETA: 1:56 - loss: 1.1516 - regression_loss: 1.0130 - classification_loss: 0.1386 157/500 [========>.....................] - ETA: 1:56 - loss: 1.1540 - regression_loss: 1.0149 - classification_loss: 0.1390 158/500 [========>.....................] - ETA: 1:56 - loss: 1.1515 - regression_loss: 1.0130 - classification_loss: 0.1385 159/500 [========>.....................] - ETA: 1:55 - loss: 1.1500 - regression_loss: 1.0120 - classification_loss: 0.1380 160/500 [========>.....................] - ETA: 1:55 - loss: 1.1462 - regression_loss: 1.0089 - classification_loss: 0.1374 161/500 [========>.....................] - ETA: 1:55 - loss: 1.1462 - regression_loss: 1.0089 - classification_loss: 0.1373 162/500 [========>.....................] - ETA: 1:54 - loss: 1.1468 - regression_loss: 1.0093 - classification_loss: 0.1375 163/500 [========>.....................] - ETA: 1:54 - loss: 1.1436 - regression_loss: 1.0066 - classification_loss: 0.1370 164/500 [========>.....................] - ETA: 1:53 - loss: 1.1436 - regression_loss: 1.0066 - classification_loss: 0.1370 165/500 [========>.....................] - ETA: 1:53 - loss: 1.1426 - regression_loss: 1.0058 - classification_loss: 0.1368 166/500 [========>.....................] - ETA: 1:53 - loss: 1.1403 - regression_loss: 1.0039 - classification_loss: 0.1363 167/500 [=========>....................] - ETA: 1:52 - loss: 1.1406 - regression_loss: 1.0043 - classification_loss: 0.1363 168/500 [=========>....................] - ETA: 1:52 - loss: 1.1363 - regression_loss: 1.0006 - classification_loss: 0.1357 169/500 [=========>....................] 
- ETA: 1:52 - loss: 1.1383 - regression_loss: 1.0024 - classification_loss: 0.1359 170/500 [=========>....................] - ETA: 1:52 - loss: 1.1387 - regression_loss: 1.0028 - classification_loss: 0.1360 171/500 [=========>....................] - ETA: 1:51 - loss: 1.1416 - regression_loss: 1.0054 - classification_loss: 0.1361 172/500 [=========>....................] - ETA: 1:51 - loss: 1.1406 - regression_loss: 1.0043 - classification_loss: 0.1363 173/500 [=========>....................] - ETA: 1:51 - loss: 1.1396 - regression_loss: 1.0035 - classification_loss: 0.1361 174/500 [=========>....................] - ETA: 1:50 - loss: 1.1414 - regression_loss: 1.0050 - classification_loss: 0.1364 175/500 [=========>....................] - ETA: 1:50 - loss: 1.1412 - regression_loss: 1.0049 - classification_loss: 0.1363 176/500 [=========>....................] - ETA: 1:50 - loss: 1.1418 - regression_loss: 1.0055 - classification_loss: 0.1363 177/500 [=========>....................] - ETA: 1:49 - loss: 1.1402 - regression_loss: 1.0044 - classification_loss: 0.1358 178/500 [=========>....................] - ETA: 1:49 - loss: 1.1411 - regression_loss: 1.0053 - classification_loss: 0.1358 179/500 [=========>....................] - ETA: 1:48 - loss: 1.1379 - regression_loss: 1.0027 - classification_loss: 0.1352 180/500 [=========>....................] - ETA: 1:48 - loss: 1.1411 - regression_loss: 1.0057 - classification_loss: 0.1354 181/500 [=========>....................] - ETA: 1:48 - loss: 1.1441 - regression_loss: 1.0076 - classification_loss: 0.1365 182/500 [=========>....................] - ETA: 1:47 - loss: 1.1451 - regression_loss: 1.0087 - classification_loss: 0.1364 183/500 [=========>....................] - ETA: 1:47 - loss: 1.1442 - regression_loss: 1.0077 - classification_loss: 0.1365 184/500 [==========>...................] - ETA: 1:47 - loss: 1.1432 - regression_loss: 1.0069 - classification_loss: 0.1363 185/500 [==========>...................] 
- ETA: 1:46 - loss: 1.1407 - regression_loss: 1.0048 - classification_loss: 0.1359 186/500 [==========>...................] - ETA: 1:46 - loss: 1.1391 - regression_loss: 1.0035 - classification_loss: 0.1356 187/500 [==========>...................] - ETA: 1:46 - loss: 1.1385 - regression_loss: 1.0032 - classification_loss: 0.1352 188/500 [==========>...................] - ETA: 1:45 - loss: 1.1393 - regression_loss: 1.0040 - classification_loss: 0.1354 189/500 [==========>...................] - ETA: 1:45 - loss: 1.1416 - regression_loss: 1.0058 - classification_loss: 0.1358 190/500 [==========>...................] - ETA: 1:45 - loss: 1.1389 - regression_loss: 1.0036 - classification_loss: 0.1352 191/500 [==========>...................] - ETA: 1:44 - loss: 1.1365 - regression_loss: 1.0017 - classification_loss: 0.1348 192/500 [==========>...................] - ETA: 1:44 - loss: 1.1392 - regression_loss: 1.0038 - classification_loss: 0.1354 193/500 [==========>...................] - ETA: 1:44 - loss: 1.1377 - regression_loss: 1.0027 - classification_loss: 0.1350 194/500 [==========>...................] - ETA: 1:43 - loss: 1.1386 - regression_loss: 1.0032 - classification_loss: 0.1353 195/500 [==========>...................] - ETA: 1:43 - loss: 1.1390 - regression_loss: 1.0036 - classification_loss: 0.1354 196/500 [==========>...................] - ETA: 1:43 - loss: 1.1407 - regression_loss: 1.0050 - classification_loss: 0.1357 197/500 [==========>...................] - ETA: 1:42 - loss: 1.1409 - regression_loss: 1.0051 - classification_loss: 0.1358 198/500 [==========>...................] - ETA: 1:42 - loss: 1.1379 - regression_loss: 1.0023 - classification_loss: 0.1355 199/500 [==========>...................] - ETA: 1:42 - loss: 1.1379 - regression_loss: 1.0026 - classification_loss: 0.1353 200/500 [===========>..................] - ETA: 1:41 - loss: 1.1359 - regression_loss: 1.0009 - classification_loss: 0.1349 201/500 [===========>..................] 
- ETA: 1:41 - loss: 1.1391 - regression_loss: 1.0037 - classification_loss: 0.1354 202/500 [===========>..................] - ETA: 1:41 - loss: 1.1389 - regression_loss: 1.0035 - classification_loss: 0.1354 203/500 [===========>..................] - ETA: 1:40 - loss: 1.1385 - regression_loss: 1.0035 - classification_loss: 0.1350 204/500 [===========>..................] - ETA: 1:40 - loss: 1.1406 - regression_loss: 1.0049 - classification_loss: 0.1356 205/500 [===========>..................] - ETA: 1:40 - loss: 1.1392 - regression_loss: 1.0039 - classification_loss: 0.1354 206/500 [===========>..................] - ETA: 1:39 - loss: 1.1419 - regression_loss: 1.0060 - classification_loss: 0.1359 207/500 [===========>..................] - ETA: 1:39 - loss: 1.1439 - regression_loss: 1.0080 - classification_loss: 0.1359 208/500 [===========>..................] - ETA: 1:39 - loss: 1.1464 - regression_loss: 1.0105 - classification_loss: 0.1359 209/500 [===========>..................] - ETA: 1:38 - loss: 1.1477 - regression_loss: 1.0113 - classification_loss: 0.1364 210/500 [===========>..................] - ETA: 1:38 - loss: 1.1488 - regression_loss: 1.0123 - classification_loss: 0.1365 211/500 [===========>..................] - ETA: 1:38 - loss: 1.1449 - regression_loss: 1.0089 - classification_loss: 0.1360 212/500 [===========>..................] - ETA: 1:37 - loss: 1.1443 - regression_loss: 1.0086 - classification_loss: 0.1357 213/500 [===========>..................] - ETA: 1:37 - loss: 1.1447 - regression_loss: 1.0090 - classification_loss: 0.1357 214/500 [===========>..................] - ETA: 1:37 - loss: 1.1445 - regression_loss: 1.0086 - classification_loss: 0.1359 215/500 [===========>..................] - ETA: 1:36 - loss: 1.1457 - regression_loss: 1.0096 - classification_loss: 0.1361 216/500 [===========>..................] - ETA: 1:36 - loss: 1.1464 - regression_loss: 1.0100 - classification_loss: 0.1364 217/500 [============>.................] 
- ETA: 1:35 - loss: 1.1480 - regression_loss: 1.0113 - classification_loss: 0.1367 218/500 [============>.................] - ETA: 1:35 - loss: 1.1497 - regression_loss: 1.0129 - classification_loss: 0.1368 219/500 [============>.................] - ETA: 1:35 - loss: 1.1498 - regression_loss: 1.0131 - classification_loss: 0.1367 220/500 [============>.................] - ETA: 1:35 - loss: 1.1487 - regression_loss: 1.0122 - classification_loss: 0.1365 221/500 [============>.................] - ETA: 1:34 - loss: 1.1483 - regression_loss: 1.0119 - classification_loss: 0.1365 222/500 [============>.................] - ETA: 1:34 - loss: 1.1505 - regression_loss: 1.0137 - classification_loss: 0.1369 223/500 [============>.................] - ETA: 1:34 - loss: 1.1482 - regression_loss: 1.0117 - classification_loss: 0.1365 224/500 [============>.................] - ETA: 1:33 - loss: 1.1489 - regression_loss: 1.0122 - classification_loss: 0.1367 225/500 [============>.................] - ETA: 1:33 - loss: 1.1491 - regression_loss: 1.0125 - classification_loss: 0.1366 226/500 [============>.................] - ETA: 1:33 - loss: 1.1487 - regression_loss: 1.0124 - classification_loss: 0.1363 227/500 [============>.................] - ETA: 1:32 - loss: 1.1485 - regression_loss: 1.0121 - classification_loss: 0.1364 228/500 [============>.................] - ETA: 1:32 - loss: 1.1458 - regression_loss: 1.0099 - classification_loss: 0.1359 229/500 [============>.................] - ETA: 1:32 - loss: 1.1484 - regression_loss: 1.0121 - classification_loss: 0.1363 230/500 [============>.................] - ETA: 1:31 - loss: 1.1492 - regression_loss: 1.0128 - classification_loss: 0.1364 231/500 [============>.................] - ETA: 1:31 - loss: 1.1491 - regression_loss: 1.0128 - classification_loss: 0.1363 232/500 [============>.................] - ETA: 1:31 - loss: 1.1473 - regression_loss: 1.0113 - classification_loss: 0.1360 233/500 [============>.................] 
- ETA: 1:30 - loss: 1.1448 - regression_loss: 1.0091 - classification_loss: 0.1357 234/500 [=============>................] - ETA: 1:30 - loss: 1.1457 - regression_loss: 1.0099 - classification_loss: 0.1358 235/500 [=============>................] - ETA: 1:29 - loss: 1.1444 - regression_loss: 1.0089 - classification_loss: 0.1355 236/500 [=============>................] - ETA: 1:29 - loss: 1.1443 - regression_loss: 1.0089 - classification_loss: 0.1354 237/500 [=============>................] - ETA: 1:29 - loss: 1.1461 - regression_loss: 1.0104 - classification_loss: 0.1357 238/500 [=============>................] - ETA: 1:28 - loss: 1.1442 - regression_loss: 1.0087 - classification_loss: 0.1355 239/500 [=============>................] - ETA: 1:28 - loss: 1.1442 - regression_loss: 1.0087 - classification_loss: 0.1355 240/500 [=============>................] - ETA: 1:28 - loss: 1.1405 - regression_loss: 1.0055 - classification_loss: 0.1350 241/500 [=============>................] - ETA: 1:27 - loss: 1.1394 - regression_loss: 1.0045 - classification_loss: 0.1349 242/500 [=============>................] - ETA: 1:27 - loss: 1.1377 - regression_loss: 1.0031 - classification_loss: 0.1345 243/500 [=============>................] - ETA: 1:27 - loss: 1.1393 - regression_loss: 1.0044 - classification_loss: 0.1349 244/500 [=============>................] - ETA: 1:26 - loss: 1.1386 - regression_loss: 1.0038 - classification_loss: 0.1348 245/500 [=============>................] - ETA: 1:26 - loss: 1.1380 - regression_loss: 1.0033 - classification_loss: 0.1347 246/500 [=============>................] - ETA: 1:26 - loss: 1.1389 - regression_loss: 1.0039 - classification_loss: 0.1349 247/500 [=============>................] - ETA: 1:25 - loss: 1.1399 - regression_loss: 1.0049 - classification_loss: 0.1350 248/500 [=============>................] - ETA: 1:25 - loss: 1.1392 - regression_loss: 1.0043 - classification_loss: 0.1349 249/500 [=============>................] 
- ETA: 1:25 - loss: 1.1401 - regression_loss: 1.0051 - classification_loss: 0.1350 250/500 [==============>...............] - ETA: 1:24 - loss: 1.1400 - regression_loss: 1.0050 - classification_loss: 0.1351 251/500 [==============>...............] - ETA: 1:24 - loss: 1.1400 - regression_loss: 1.0048 - classification_loss: 0.1352 252/500 [==============>...............] - ETA: 1:24 - loss: 1.1410 - regression_loss: 1.0056 - classification_loss: 0.1355 253/500 [==============>...............] - ETA: 1:23 - loss: 1.1416 - regression_loss: 1.0061 - classification_loss: 0.1354 254/500 [==============>...............] - ETA: 1:23 - loss: 1.1432 - regression_loss: 1.0078 - classification_loss: 0.1354 255/500 [==============>...............] - ETA: 1:23 - loss: 1.1419 - regression_loss: 1.0068 - classification_loss: 0.1351 256/500 [==============>...............] - ETA: 1:22 - loss: 1.1432 - regression_loss: 1.0080 - classification_loss: 0.1353 257/500 [==============>...............] - ETA: 1:22 - loss: 1.1438 - regression_loss: 1.0086 - classification_loss: 0.1352 258/500 [==============>...............] - ETA: 1:22 - loss: 1.1413 - regression_loss: 1.0064 - classification_loss: 0.1348 259/500 [==============>...............] - ETA: 1:21 - loss: 1.1416 - regression_loss: 1.0067 - classification_loss: 0.1349 260/500 [==============>...............] - ETA: 1:21 - loss: 1.1409 - regression_loss: 1.0062 - classification_loss: 0.1347 261/500 [==============>...............] - ETA: 1:21 - loss: 1.1415 - regression_loss: 1.0068 - classification_loss: 0.1346 262/500 [==============>...............] - ETA: 1:20 - loss: 1.1429 - regression_loss: 1.0081 - classification_loss: 0.1348 263/500 [==============>...............] - ETA: 1:20 - loss: 1.1432 - regression_loss: 1.0084 - classification_loss: 0.1347 264/500 [==============>...............] - ETA: 1:20 - loss: 1.1416 - regression_loss: 1.0071 - classification_loss: 0.1345 265/500 [==============>...............] 
- ETA: 1:19 - loss: 1.1419 - regression_loss: 1.0073 - classification_loss: 0.1346 266/500 [==============>...............] - ETA: 1:19 - loss: 1.1427 - regression_loss: 1.0080 - classification_loss: 0.1347 267/500 [===============>..............] - ETA: 1:18 - loss: 1.1429 - regression_loss: 1.0081 - classification_loss: 0.1348 268/500 [===============>..............] - ETA: 1:18 - loss: 1.1444 - regression_loss: 1.0095 - classification_loss: 0.1349 269/500 [===============>..............] - ETA: 1:18 - loss: 1.1439 - regression_loss: 1.0093 - classification_loss: 0.1347 270/500 [===============>..............] - ETA: 1:17 - loss: 1.1437 - regression_loss: 1.0090 - classification_loss: 0.1346 271/500 [===============>..............] - ETA: 1:17 - loss: 1.1441 - regression_loss: 1.0095 - classification_loss: 0.1346 272/500 [===============>..............] - ETA: 1:17 - loss: 1.1446 - regression_loss: 1.0098 - classification_loss: 0.1348 273/500 [===============>..............] - ETA: 1:16 - loss: 1.1462 - regression_loss: 1.0113 - classification_loss: 0.1349 274/500 [===============>..............] - ETA: 1:16 - loss: 1.1443 - regression_loss: 1.0095 - classification_loss: 0.1348 275/500 [===============>..............] - ETA: 1:16 - loss: 1.1424 - regression_loss: 1.0078 - classification_loss: 0.1346 276/500 [===============>..............] - ETA: 1:15 - loss: 1.1409 - regression_loss: 1.0066 - classification_loss: 0.1343 277/500 [===============>..............] - ETA: 1:15 - loss: 1.1408 - regression_loss: 1.0065 - classification_loss: 0.1343 278/500 [===============>..............] - ETA: 1:15 - loss: 1.1427 - regression_loss: 1.0083 - classification_loss: 0.1343 279/500 [===============>..............] - ETA: 1:14 - loss: 1.1402 - regression_loss: 1.0063 - classification_loss: 0.1339 280/500 [===============>..............] - ETA: 1:14 - loss: 1.1409 - regression_loss: 1.0069 - classification_loss: 0.1340 281/500 [===============>..............] 
- ETA: 1:14 - loss: 1.1449 - regression_loss: 1.0100 - classification_loss: 0.1349 282/500 [===============>..............] - ETA: 1:13 - loss: 1.1445 - regression_loss: 1.0095 - classification_loss: 0.1351 283/500 [===============>..............] - ETA: 1:13 - loss: 1.1440 - regression_loss: 1.0091 - classification_loss: 0.1349 284/500 [================>.............] - ETA: 1:13 - loss: 1.1464 - regression_loss: 1.0111 - classification_loss: 0.1353 285/500 [================>.............] - ETA: 1:12 - loss: 1.1466 - regression_loss: 1.0111 - classification_loss: 0.1355 286/500 [================>.............] - ETA: 1:12 - loss: 1.1467 - regression_loss: 1.0112 - classification_loss: 0.1355 287/500 [================>.............] - ETA: 1:12 - loss: 1.1442 - regression_loss: 1.0090 - classification_loss: 0.1352 288/500 [================>.............] - ETA: 1:11 - loss: 1.1447 - regression_loss: 1.0094 - classification_loss: 0.1353 289/500 [================>.............] - ETA: 1:11 - loss: 1.1466 - regression_loss: 1.0110 - classification_loss: 0.1357 290/500 [================>.............] - ETA: 1:11 - loss: 1.1451 - regression_loss: 1.0098 - classification_loss: 0.1353 291/500 [================>.............] - ETA: 1:10 - loss: 1.1449 - regression_loss: 1.0096 - classification_loss: 0.1353 292/500 [================>.............] - ETA: 1:10 - loss: 1.1451 - regression_loss: 1.0096 - classification_loss: 0.1355 293/500 [================>.............] - ETA: 1:10 - loss: 1.1446 - regression_loss: 1.0093 - classification_loss: 0.1353 294/500 [================>.............] - ETA: 1:09 - loss: 1.1451 - regression_loss: 1.0097 - classification_loss: 0.1354 295/500 [================>.............] - ETA: 1:09 - loss: 1.1439 - regression_loss: 1.0087 - classification_loss: 0.1352 296/500 [================>.............] - ETA: 1:09 - loss: 1.1437 - regression_loss: 1.0085 - classification_loss: 0.1352 297/500 [================>.............] 
- ETA: 1:08 - loss: 1.1431 - regression_loss: 1.0080 - classification_loss: 0.1351 298/500 [================>.............] - ETA: 1:08 - loss: 1.1434 - regression_loss: 1.0082 - classification_loss: 0.1351 299/500 [================>.............] - ETA: 1:08 - loss: 1.1419 - regression_loss: 1.0070 - classification_loss: 0.1348 300/500 [=================>............] - ETA: 1:07 - loss: 1.1421 - regression_loss: 1.0070 - classification_loss: 0.1351 301/500 [=================>............] - ETA: 1:07 - loss: 1.1416 - regression_loss: 1.0063 - classification_loss: 0.1353 302/500 [=================>............] - ETA: 1:07 - loss: 1.1429 - regression_loss: 1.0075 - classification_loss: 0.1354 303/500 [=================>............] - ETA: 1:06 - loss: 1.1458 - regression_loss: 1.0099 - classification_loss: 0.1359 304/500 [=================>............] - ETA: 1:06 - loss: 1.1461 - regression_loss: 1.0100 - classification_loss: 0.1361 305/500 [=================>............] - ETA: 1:06 - loss: 1.1477 - regression_loss: 1.0113 - classification_loss: 0.1364 306/500 [=================>............] - ETA: 1:05 - loss: 1.1500 - regression_loss: 1.0132 - classification_loss: 0.1368 307/500 [=================>............] - ETA: 1:05 - loss: 1.1496 - regression_loss: 1.0129 - classification_loss: 0.1367 308/500 [=================>............] - ETA: 1:05 - loss: 1.1475 - regression_loss: 1.0110 - classification_loss: 0.1365 309/500 [=================>............] - ETA: 1:04 - loss: 1.1464 - regression_loss: 1.0101 - classification_loss: 0.1363 310/500 [=================>............] - ETA: 1:04 - loss: 1.1457 - regression_loss: 1.0094 - classification_loss: 0.1363 311/500 [=================>............] - ETA: 1:04 - loss: 1.1452 - regression_loss: 1.0091 - classification_loss: 0.1361 312/500 [=================>............] - ETA: 1:03 - loss: 1.1457 - regression_loss: 1.0092 - classification_loss: 0.1365 313/500 [=================>............] 
- ETA: 1:03 - loss: 1.1454 - regression_loss: 1.0090 - classification_loss: 0.1364 314/500 [=================>............] - ETA: 1:03 - loss: 1.1473 - regression_loss: 1.0105 - classification_loss: 0.1368 315/500 [=================>............] - ETA: 1:02 - loss: 1.1486 - regression_loss: 1.0115 - classification_loss: 0.1371 316/500 [=================>............] - ETA: 1:02 - loss: 1.1471 - regression_loss: 1.0102 - classification_loss: 0.1369 317/500 [==================>...........] - ETA: 1:02 - loss: 1.1464 - regression_loss: 1.0096 - classification_loss: 0.1368 318/500 [==================>...........] - ETA: 1:01 - loss: 1.1455 - regression_loss: 1.0089 - classification_loss: 0.1367 319/500 [==================>...........] - ETA: 1:01 - loss: 1.1454 - regression_loss: 1.0088 - classification_loss: 0.1366 320/500 [==================>...........] - ETA: 1:01 - loss: 1.1459 - regression_loss: 1.0093 - classification_loss: 0.1366 321/500 [==================>...........] - ETA: 1:00 - loss: 1.1470 - regression_loss: 1.0103 - classification_loss: 0.1367 322/500 [==================>...........] - ETA: 1:00 - loss: 1.1456 - regression_loss: 1.0083 - classification_loss: 0.1373 323/500 [==================>...........] - ETA: 1:00 - loss: 1.1485 - regression_loss: 1.0109 - classification_loss: 0.1375 324/500 [==================>...........] - ETA: 59s - loss: 1.1483 - regression_loss: 1.0108 - classification_loss: 0.1375  325/500 [==================>...........] - ETA: 59s - loss: 1.1500 - regression_loss: 1.0122 - classification_loss: 0.1377 326/500 [==================>...........] - ETA: 58s - loss: 1.1498 - regression_loss: 1.0121 - classification_loss: 0.1376 327/500 [==================>...........] - ETA: 58s - loss: 1.1489 - regression_loss: 1.0113 - classification_loss: 0.1377 328/500 [==================>...........] - ETA: 58s - loss: 1.1488 - regression_loss: 1.0113 - classification_loss: 0.1375 329/500 [==================>...........] 
...
500/500 [==============================] - 170s 339ms/step - loss: 1.1303 - regression_loss: 0.9965 - classification_loss: 0.1338
1172 instances of class plum with average precision: 0.8083
mAP: 0.8083
Epoch 00023: saving model to ./training/snapshots/resnet101_pascal_23.h5
Epoch 24/150
...
- ETA: 2:10 - loss: 1.1340 - regression_loss: 0.9949 - classification_loss: 0.1391 117/500 [======>.......................] - ETA: 2:10 - loss: 1.1342 - regression_loss: 0.9952 - classification_loss: 0.1389 118/500 [======>.......................] - ETA: 2:09 - loss: 1.1339 - regression_loss: 0.9949 - classification_loss: 0.1390 119/500 [======>.......................] - ETA: 2:09 - loss: 1.1313 - regression_loss: 0.9929 - classification_loss: 0.1384 120/500 [======>.......................] - ETA: 2:08 - loss: 1.1322 - regression_loss: 0.9939 - classification_loss: 0.1383 121/500 [======>.......................] - ETA: 2:08 - loss: 1.1308 - regression_loss: 0.9928 - classification_loss: 0.1380 122/500 [======>.......................] - ETA: 2:08 - loss: 1.1383 - regression_loss: 0.9993 - classification_loss: 0.1390 123/500 [======>.......................] - ETA: 2:07 - loss: 1.1352 - regression_loss: 0.9967 - classification_loss: 0.1385 124/500 [======>.......................] - ETA: 2:07 - loss: 1.1366 - regression_loss: 0.9979 - classification_loss: 0.1387 125/500 [======>.......................] - ETA: 2:07 - loss: 1.1372 - regression_loss: 0.9983 - classification_loss: 0.1388 126/500 [======>.......................] - ETA: 2:06 - loss: 1.1373 - regression_loss: 0.9988 - classification_loss: 0.1385 127/500 [======>.......................] - ETA: 2:06 - loss: 1.1331 - regression_loss: 0.9954 - classification_loss: 0.1377 128/500 [======>.......................] - ETA: 2:06 - loss: 1.1340 - regression_loss: 0.9962 - classification_loss: 0.1377 129/500 [======>.......................] - ETA: 2:05 - loss: 1.1301 - regression_loss: 0.9925 - classification_loss: 0.1376 130/500 [======>.......................] - ETA: 2:05 - loss: 1.1319 - regression_loss: 0.9942 - classification_loss: 0.1377 131/500 [======>.......................] - ETA: 2:05 - loss: 1.1336 - regression_loss: 0.9955 - classification_loss: 0.1381 132/500 [======>.......................] 
- ETA: 2:04 - loss: 1.1354 - regression_loss: 0.9970 - classification_loss: 0.1383 133/500 [======>.......................] - ETA: 2:04 - loss: 1.1353 - regression_loss: 0.9967 - classification_loss: 0.1385 134/500 [=======>......................] - ETA: 2:04 - loss: 1.1351 - regression_loss: 0.9967 - classification_loss: 0.1384 135/500 [=======>......................] - ETA: 2:03 - loss: 1.1366 - regression_loss: 0.9986 - classification_loss: 0.1381 136/500 [=======>......................] - ETA: 2:03 - loss: 1.1322 - regression_loss: 0.9948 - classification_loss: 0.1374 137/500 [=======>......................] - ETA: 2:03 - loss: 1.1312 - regression_loss: 0.9942 - classification_loss: 0.1370 138/500 [=======>......................] - ETA: 2:02 - loss: 1.1279 - regression_loss: 0.9915 - classification_loss: 0.1365 139/500 [=======>......................] - ETA: 2:02 - loss: 1.1264 - regression_loss: 0.9906 - classification_loss: 0.1357 140/500 [=======>......................] - ETA: 2:01 - loss: 1.1255 - regression_loss: 0.9900 - classification_loss: 0.1356 141/500 [=======>......................] - ETA: 2:01 - loss: 1.1290 - regression_loss: 0.9933 - classification_loss: 0.1357 142/500 [=======>......................] - ETA: 2:01 - loss: 1.1269 - regression_loss: 0.9917 - classification_loss: 0.1351 143/500 [=======>......................] - ETA: 2:00 - loss: 1.1246 - regression_loss: 0.9898 - classification_loss: 0.1348 144/500 [=======>......................] - ETA: 2:00 - loss: 1.1226 - regression_loss: 0.9883 - classification_loss: 0.1344 145/500 [=======>......................] - ETA: 2:00 - loss: 1.1272 - regression_loss: 0.9919 - classification_loss: 0.1353 146/500 [=======>......................] - ETA: 1:59 - loss: 1.1287 - regression_loss: 0.9933 - classification_loss: 0.1353 147/500 [=======>......................] - ETA: 1:59 - loss: 1.1310 - regression_loss: 0.9957 - classification_loss: 0.1354 148/500 [=======>......................] 
- ETA: 1:59 - loss: 1.1310 - regression_loss: 0.9957 - classification_loss: 0.1353 149/500 [=======>......................] - ETA: 1:58 - loss: 1.1308 - regression_loss: 0.9955 - classification_loss: 0.1353 150/500 [========>.....................] - ETA: 1:58 - loss: 1.1261 - regression_loss: 0.9916 - classification_loss: 0.1345 151/500 [========>.....................] - ETA: 1:58 - loss: 1.1254 - regression_loss: 0.9913 - classification_loss: 0.1342 152/500 [========>.....................] - ETA: 1:57 - loss: 1.1241 - regression_loss: 0.9905 - classification_loss: 0.1336 153/500 [========>.....................] - ETA: 1:57 - loss: 1.1259 - regression_loss: 0.9923 - classification_loss: 0.1336 154/500 [========>.....................] - ETA: 1:57 - loss: 1.1216 - regression_loss: 0.9886 - classification_loss: 0.1330 155/500 [========>.....................] - ETA: 1:56 - loss: 1.1226 - regression_loss: 0.9895 - classification_loss: 0.1331 156/500 [========>.....................] - ETA: 1:56 - loss: 1.1197 - regression_loss: 0.9870 - classification_loss: 0.1327 157/500 [========>.....................] - ETA: 1:56 - loss: 1.1209 - regression_loss: 0.9880 - classification_loss: 0.1329 158/500 [========>.....................] - ETA: 1:55 - loss: 1.1158 - regression_loss: 0.9835 - classification_loss: 0.1322 159/500 [========>.....................] - ETA: 1:55 - loss: 1.1138 - regression_loss: 0.9819 - classification_loss: 0.1318 160/500 [========>.....................] - ETA: 1:55 - loss: 1.1111 - regression_loss: 0.9798 - classification_loss: 0.1313 161/500 [========>.....................] - ETA: 1:54 - loss: 1.1116 - regression_loss: 0.9802 - classification_loss: 0.1314 162/500 [========>.....................] - ETA: 1:54 - loss: 1.1128 - regression_loss: 0.9812 - classification_loss: 0.1316 163/500 [========>.....................] - ETA: 1:54 - loss: 1.1098 - regression_loss: 0.9787 - classification_loss: 0.1311 164/500 [========>.....................] 
- ETA: 1:53 - loss: 1.1070 - regression_loss: 0.9763 - classification_loss: 0.1307 165/500 [========>.....................] - ETA: 1:53 - loss: 1.1028 - regression_loss: 0.9727 - classification_loss: 0.1301 166/500 [========>.....................] - ETA: 1:53 - loss: 1.0986 - regression_loss: 0.9692 - classification_loss: 0.1295 167/500 [=========>....................] - ETA: 1:52 - loss: 1.0987 - regression_loss: 0.9694 - classification_loss: 0.1293 168/500 [=========>....................] - ETA: 1:52 - loss: 1.0979 - regression_loss: 0.9689 - classification_loss: 0.1290 169/500 [=========>....................] - ETA: 1:52 - loss: 1.0951 - regression_loss: 0.9666 - classification_loss: 0.1285 170/500 [=========>....................] - ETA: 1:51 - loss: 1.0969 - regression_loss: 0.9684 - classification_loss: 0.1285 171/500 [=========>....................] - ETA: 1:51 - loss: 1.0962 - regression_loss: 0.9674 - classification_loss: 0.1287 172/500 [=========>....................] - ETA: 1:51 - loss: 1.0982 - regression_loss: 0.9693 - classification_loss: 0.1290 173/500 [=========>....................] - ETA: 1:50 - loss: 1.0971 - regression_loss: 0.9682 - classification_loss: 0.1289 174/500 [=========>....................] - ETA: 1:50 - loss: 1.0952 - regression_loss: 0.9668 - classification_loss: 0.1285 175/500 [=========>....................] - ETA: 1:50 - loss: 1.0917 - regression_loss: 0.9638 - classification_loss: 0.1279 176/500 [=========>....................] - ETA: 1:49 - loss: 1.0919 - regression_loss: 0.9636 - classification_loss: 0.1283 177/500 [=========>....................] - ETA: 1:49 - loss: 1.0913 - regression_loss: 0.9631 - classification_loss: 0.1282 178/500 [=========>....................] - ETA: 1:49 - loss: 1.0899 - regression_loss: 0.9618 - classification_loss: 0.1281 179/500 [=========>....................] - ETA: 1:48 - loss: 1.0894 - regression_loss: 0.9614 - classification_loss: 0.1279 180/500 [=========>....................] 
- ETA: 1:48 - loss: 1.0899 - regression_loss: 0.9619 - classification_loss: 0.1281 181/500 [=========>....................] - ETA: 1:48 - loss: 1.0906 - regression_loss: 0.9617 - classification_loss: 0.1289 182/500 [=========>....................] - ETA: 1:47 - loss: 1.0914 - regression_loss: 0.9626 - classification_loss: 0.1288 183/500 [=========>....................] - ETA: 1:47 - loss: 1.0924 - regression_loss: 0.9635 - classification_loss: 0.1289 184/500 [==========>...................] - ETA: 1:47 - loss: 1.0930 - regression_loss: 0.9640 - classification_loss: 0.1290 185/500 [==========>...................] - ETA: 1:46 - loss: 1.0923 - regression_loss: 0.9634 - classification_loss: 0.1289 186/500 [==========>...................] - ETA: 1:46 - loss: 1.0943 - regression_loss: 0.9651 - classification_loss: 0.1293 187/500 [==========>...................] - ETA: 1:46 - loss: 1.0955 - regression_loss: 0.9661 - classification_loss: 0.1293 188/500 [==========>...................] - ETA: 1:45 - loss: 1.0957 - regression_loss: 0.9665 - classification_loss: 0.1293 189/500 [==========>...................] - ETA: 1:45 - loss: 1.0964 - regression_loss: 0.9671 - classification_loss: 0.1293 190/500 [==========>...................] - ETA: 1:45 - loss: 1.0970 - regression_loss: 0.9678 - classification_loss: 0.1291 191/500 [==========>...................] - ETA: 1:44 - loss: 1.0973 - regression_loss: 0.9678 - classification_loss: 0.1295 192/500 [==========>...................] - ETA: 1:44 - loss: 1.0988 - regression_loss: 0.9695 - classification_loss: 0.1293 193/500 [==========>...................] - ETA: 1:44 - loss: 1.0963 - regression_loss: 0.9673 - classification_loss: 0.1289 194/500 [==========>...................] - ETA: 1:43 - loss: 1.0965 - regression_loss: 0.9677 - classification_loss: 0.1288 195/500 [==========>...................] - ETA: 1:43 - loss: 1.0963 - regression_loss: 0.9676 - classification_loss: 0.1287 196/500 [==========>...................] 
- ETA: 1:43 - loss: 1.0962 - regression_loss: 0.9676 - classification_loss: 0.1286 197/500 [==========>...................] - ETA: 1:42 - loss: 1.0986 - regression_loss: 0.9695 - classification_loss: 0.1292 198/500 [==========>...................] - ETA: 1:42 - loss: 1.0998 - regression_loss: 0.9702 - classification_loss: 0.1296 199/500 [==========>...................] - ETA: 1:42 - loss: 1.0979 - regression_loss: 0.9687 - classification_loss: 0.1292 200/500 [===========>..................] - ETA: 1:41 - loss: 1.0994 - regression_loss: 0.9700 - classification_loss: 0.1294 201/500 [===========>..................] - ETA: 1:41 - loss: 1.0964 - regression_loss: 0.9674 - classification_loss: 0.1290 202/500 [===========>..................] - ETA: 1:41 - loss: 1.0940 - regression_loss: 0.9653 - classification_loss: 0.1288 203/500 [===========>..................] - ETA: 1:40 - loss: 1.0967 - regression_loss: 0.9675 - classification_loss: 0.1292 204/500 [===========>..................] - ETA: 1:40 - loss: 1.0963 - regression_loss: 0.9673 - classification_loss: 0.1290 205/500 [===========>..................] - ETA: 1:40 - loss: 1.0951 - regression_loss: 0.9663 - classification_loss: 0.1289 206/500 [===========>..................] - ETA: 1:39 - loss: 1.0959 - regression_loss: 0.9669 - classification_loss: 0.1290 207/500 [===========>..................] - ETA: 1:39 - loss: 1.0962 - regression_loss: 0.9672 - classification_loss: 0.1290 208/500 [===========>..................] - ETA: 1:39 - loss: 1.0982 - regression_loss: 0.9695 - classification_loss: 0.1287 209/500 [===========>..................] - ETA: 1:38 - loss: 1.0997 - regression_loss: 0.9708 - classification_loss: 0.1289 210/500 [===========>..................] - ETA: 1:38 - loss: 1.1006 - regression_loss: 0.9716 - classification_loss: 0.1290 211/500 [===========>..................] - ETA: 1:38 - loss: 1.1009 - regression_loss: 0.9718 - classification_loss: 0.1291 212/500 [===========>..................] 
- ETA: 1:37 - loss: 1.1003 - regression_loss: 0.9712 - classification_loss: 0.1291 213/500 [===========>..................] - ETA: 1:37 - loss: 1.1003 - regression_loss: 0.9711 - classification_loss: 0.1292 214/500 [===========>..................] - ETA: 1:37 - loss: 1.1012 - regression_loss: 0.9718 - classification_loss: 0.1294 215/500 [===========>..................] - ETA: 1:36 - loss: 1.1035 - regression_loss: 0.9737 - classification_loss: 0.1297 216/500 [===========>..................] - ETA: 1:36 - loss: 1.1043 - regression_loss: 0.9744 - classification_loss: 0.1299 217/500 [============>.................] - ETA: 1:36 - loss: 1.1015 - regression_loss: 0.9719 - classification_loss: 0.1296 218/500 [============>.................] - ETA: 1:35 - loss: 1.0990 - regression_loss: 0.9698 - classification_loss: 0.1293 219/500 [============>.................] - ETA: 1:35 - loss: 1.0989 - regression_loss: 0.9698 - classification_loss: 0.1292 220/500 [============>.................] - ETA: 1:35 - loss: 1.1013 - regression_loss: 0.9717 - classification_loss: 0.1295 221/500 [============>.................] - ETA: 1:34 - loss: 1.1000 - regression_loss: 0.9707 - classification_loss: 0.1293 222/500 [============>.................] - ETA: 1:34 - loss: 1.0999 - regression_loss: 0.9708 - classification_loss: 0.1291 223/500 [============>.................] - ETA: 1:34 - loss: 1.1020 - regression_loss: 0.9727 - classification_loss: 0.1293 224/500 [============>.................] - ETA: 1:33 - loss: 1.1043 - regression_loss: 0.9747 - classification_loss: 0.1296 225/500 [============>.................] - ETA: 1:33 - loss: 1.1039 - regression_loss: 0.9741 - classification_loss: 0.1298 226/500 [============>.................] - ETA: 1:33 - loss: 1.1063 - regression_loss: 0.9760 - classification_loss: 0.1302 227/500 [============>.................] - ETA: 1:32 - loss: 1.1088 - regression_loss: 0.9780 - classification_loss: 0.1308 228/500 [============>.................] 
- ETA: 1:32 - loss: 1.1082 - regression_loss: 0.9776 - classification_loss: 0.1307 229/500 [============>.................] - ETA: 1:32 - loss: 1.1091 - regression_loss: 0.9783 - classification_loss: 0.1308 230/500 [============>.................] - ETA: 1:31 - loss: 1.1073 - regression_loss: 0.9762 - classification_loss: 0.1311 231/500 [============>.................] - ETA: 1:31 - loss: 1.1080 - regression_loss: 0.9767 - classification_loss: 0.1312 232/500 [============>.................] - ETA: 1:31 - loss: 1.1086 - regression_loss: 0.9772 - classification_loss: 0.1314 233/500 [============>.................] - ETA: 1:30 - loss: 1.1082 - regression_loss: 0.9768 - classification_loss: 0.1314 234/500 [=============>................] - ETA: 1:30 - loss: 1.1074 - regression_loss: 0.9760 - classification_loss: 0.1314 235/500 [=============>................] - ETA: 1:30 - loss: 1.1090 - regression_loss: 0.9773 - classification_loss: 0.1317 236/500 [=============>................] - ETA: 1:29 - loss: 1.1091 - regression_loss: 0.9776 - classification_loss: 0.1315 237/500 [=============>................] - ETA: 1:29 - loss: 1.1110 - regression_loss: 0.9792 - classification_loss: 0.1318 238/500 [=============>................] - ETA: 1:29 - loss: 1.1100 - regression_loss: 0.9785 - classification_loss: 0.1316 239/500 [=============>................] - ETA: 1:28 - loss: 1.1107 - regression_loss: 0.9791 - classification_loss: 0.1317 240/500 [=============>................] - ETA: 1:28 - loss: 1.1084 - regression_loss: 0.9771 - classification_loss: 0.1313 241/500 [=============>................] - ETA: 1:28 - loss: 1.1090 - regression_loss: 0.9775 - classification_loss: 0.1314 242/500 [=============>................] - ETA: 1:27 - loss: 1.1091 - regression_loss: 0.9775 - classification_loss: 0.1315 243/500 [=============>................] - ETA: 1:27 - loss: 1.1088 - regression_loss: 0.9773 - classification_loss: 0.1315 244/500 [=============>................] 
- ETA: 1:26 - loss: 1.1089 - regression_loss: 0.9773 - classification_loss: 0.1316 245/500 [=============>................] - ETA: 1:26 - loss: 1.1091 - regression_loss: 0.9773 - classification_loss: 0.1318 246/500 [=============>................] - ETA: 1:26 - loss: 1.1100 - regression_loss: 0.9781 - classification_loss: 0.1319 247/500 [=============>................] - ETA: 1:25 - loss: 1.1106 - regression_loss: 0.9787 - classification_loss: 0.1319 248/500 [=============>................] - ETA: 1:25 - loss: 1.1076 - regression_loss: 0.9760 - classification_loss: 0.1315 249/500 [=============>................] - ETA: 1:25 - loss: 1.1075 - regression_loss: 0.9762 - classification_loss: 0.1314 250/500 [==============>...............] - ETA: 1:24 - loss: 1.1091 - regression_loss: 0.9771 - classification_loss: 0.1320 251/500 [==============>...............] - ETA: 1:24 - loss: 1.1065 - regression_loss: 0.9750 - classification_loss: 0.1315 252/500 [==============>...............] - ETA: 1:24 - loss: 1.1092 - regression_loss: 0.9775 - classification_loss: 0.1318 253/500 [==============>...............] - ETA: 1:23 - loss: 1.1116 - regression_loss: 0.9795 - classification_loss: 0.1321 254/500 [==============>...............] - ETA: 1:23 - loss: 1.1110 - regression_loss: 0.9790 - classification_loss: 0.1320 255/500 [==============>...............] - ETA: 1:23 - loss: 1.1121 - regression_loss: 0.9801 - classification_loss: 0.1320 256/500 [==============>...............] - ETA: 1:22 - loss: 1.1139 - regression_loss: 0.9816 - classification_loss: 0.1323 257/500 [==============>...............] - ETA: 1:22 - loss: 1.1132 - regression_loss: 0.9810 - classification_loss: 0.1323 258/500 [==============>...............] - ETA: 1:22 - loss: 1.1112 - regression_loss: 0.9792 - classification_loss: 0.1319 259/500 [==============>...............] - ETA: 1:21 - loss: 1.1113 - regression_loss: 0.9793 - classification_loss: 0.1320 260/500 [==============>...............] 
- ETA: 1:21 - loss: 1.1098 - regression_loss: 0.9779 - classification_loss: 0.1319 261/500 [==============>...............] - ETA: 1:21 - loss: 1.1107 - regression_loss: 0.9788 - classification_loss: 0.1319 262/500 [==============>...............] - ETA: 1:20 - loss: 1.1092 - regression_loss: 0.9776 - classification_loss: 0.1316 263/500 [==============>...............] - ETA: 1:20 - loss: 1.1096 - regression_loss: 0.9780 - classification_loss: 0.1316 264/500 [==============>...............] - ETA: 1:20 - loss: 1.1088 - regression_loss: 0.9774 - classification_loss: 0.1314 265/500 [==============>...............] - ETA: 1:19 - loss: 1.1103 - regression_loss: 0.9788 - classification_loss: 0.1315 266/500 [==============>...............] - ETA: 1:19 - loss: 1.1120 - regression_loss: 0.9802 - classification_loss: 0.1317 267/500 [===============>..............] - ETA: 1:19 - loss: 1.1134 - regression_loss: 0.9813 - classification_loss: 0.1321 268/500 [===============>..............] - ETA: 1:18 - loss: 1.1113 - regression_loss: 0.9794 - classification_loss: 0.1318 269/500 [===============>..............] - ETA: 1:18 - loss: 1.1130 - regression_loss: 0.9809 - classification_loss: 0.1321 270/500 [===============>..............] - ETA: 1:18 - loss: 1.1151 - regression_loss: 0.9826 - classification_loss: 0.1325 271/500 [===============>..............] - ETA: 1:17 - loss: 1.1153 - regression_loss: 0.9820 - classification_loss: 0.1332 272/500 [===============>..............] - ETA: 1:17 - loss: 1.1138 - regression_loss: 0.9809 - classification_loss: 0.1329 273/500 [===============>..............] - ETA: 1:17 - loss: 1.1207 - regression_loss: 0.9864 - classification_loss: 0.1343 274/500 [===============>..............] - ETA: 1:16 - loss: 1.1210 - regression_loss: 0.9866 - classification_loss: 0.1344 275/500 [===============>..............] - ETA: 1:16 - loss: 1.1212 - regression_loss: 0.9869 - classification_loss: 0.1343 276/500 [===============>..............] 
- ETA: 1:16 - loss: 1.1230 - regression_loss: 0.9885 - classification_loss: 0.1345 277/500 [===============>..............] - ETA: 1:15 - loss: 1.1251 - regression_loss: 0.9904 - classification_loss: 0.1347 278/500 [===============>..............] - ETA: 1:15 - loss: 1.1256 - regression_loss: 0.9908 - classification_loss: 0.1348 279/500 [===============>..............] - ETA: 1:14 - loss: 1.1259 - regression_loss: 0.9912 - classification_loss: 0.1348 280/500 [===============>..............] - ETA: 1:14 - loss: 1.1252 - regression_loss: 0.9906 - classification_loss: 0.1346 281/500 [===============>..............] - ETA: 1:14 - loss: 1.1274 - regression_loss: 0.9925 - classification_loss: 0.1349 282/500 [===============>..............] - ETA: 1:13 - loss: 1.1274 - regression_loss: 0.9921 - classification_loss: 0.1353 283/500 [===============>..............] - ETA: 1:13 - loss: 1.1278 - regression_loss: 0.9923 - classification_loss: 0.1355 284/500 [================>.............] - ETA: 1:13 - loss: 1.1280 - regression_loss: 0.9926 - classification_loss: 0.1354 285/500 [================>.............] - ETA: 1:12 - loss: 1.1275 - regression_loss: 0.9923 - classification_loss: 0.1352 286/500 [================>.............] - ETA: 1:12 - loss: 1.1289 - regression_loss: 0.9937 - classification_loss: 0.1352 287/500 [================>.............] - ETA: 1:12 - loss: 1.1273 - regression_loss: 0.9924 - classification_loss: 0.1349 288/500 [================>.............] - ETA: 1:11 - loss: 1.1264 - regression_loss: 0.9916 - classification_loss: 0.1348 289/500 [================>.............] - ETA: 1:11 - loss: 1.1270 - regression_loss: 0.9922 - classification_loss: 0.1348 290/500 [================>.............] - ETA: 1:11 - loss: 1.1249 - regression_loss: 0.9904 - classification_loss: 0.1345 291/500 [================>.............] - ETA: 1:10 - loss: 1.1272 - regression_loss: 0.9924 - classification_loss: 0.1348 292/500 [================>.............] 
- ETA: 1:10 - loss: 1.1252 - regression_loss: 0.9906 - classification_loss: 0.1346 293/500 [================>.............] - ETA: 1:10 - loss: 1.1239 - regression_loss: 0.9895 - classification_loss: 0.1344 294/500 [================>.............] - ETA: 1:09 - loss: 1.1234 - regression_loss: 0.9891 - classification_loss: 0.1343 295/500 [================>.............] - ETA: 1:09 - loss: 1.1246 - regression_loss: 0.9901 - classification_loss: 0.1345 296/500 [================>.............] - ETA: 1:09 - loss: 1.1221 - regression_loss: 0.9878 - classification_loss: 0.1343 297/500 [================>.............] - ETA: 1:08 - loss: 1.1213 - regression_loss: 0.9871 - classification_loss: 0.1341 298/500 [================>.............] - ETA: 1:08 - loss: 1.1190 - regression_loss: 0.9852 - classification_loss: 0.1338 299/500 [================>.............] - ETA: 1:08 - loss: 1.1194 - regression_loss: 0.9855 - classification_loss: 0.1338 300/500 [=================>............] - ETA: 1:07 - loss: 1.1189 - regression_loss: 0.9851 - classification_loss: 0.1338 301/500 [=================>............] - ETA: 1:07 - loss: 1.1181 - regression_loss: 0.9844 - classification_loss: 0.1337 302/500 [=================>............] - ETA: 1:07 - loss: 1.1164 - regression_loss: 0.9831 - classification_loss: 0.1333 303/500 [=================>............] - ETA: 1:06 - loss: 1.1155 - regression_loss: 0.9825 - classification_loss: 0.1331 304/500 [=================>............] - ETA: 1:06 - loss: 1.1160 - regression_loss: 0.9829 - classification_loss: 0.1331 305/500 [=================>............] - ETA: 1:06 - loss: 1.1162 - regression_loss: 0.9830 - classification_loss: 0.1332 306/500 [=================>............] - ETA: 1:05 - loss: 1.1153 - regression_loss: 0.9824 - classification_loss: 0.1329 307/500 [=================>............] - ETA: 1:05 - loss: 1.1159 - regression_loss: 0.9829 - classification_loss: 0.1330 308/500 [=================>............] 
- ETA: 1:05 - loss: 1.1180 - regression_loss: 0.9846 - classification_loss: 0.1334 309/500 [=================>............] - ETA: 1:04 - loss: 1.1185 - regression_loss: 0.9850 - classification_loss: 0.1335 310/500 [=================>............] - ETA: 1:04 - loss: 1.1186 - regression_loss: 0.9850 - classification_loss: 0.1336 311/500 [=================>............] - ETA: 1:04 - loss: 1.1172 - regression_loss: 0.9839 - classification_loss: 0.1333 312/500 [=================>............] - ETA: 1:03 - loss: 1.1171 - regression_loss: 0.9840 - classification_loss: 0.1332 313/500 [=================>............] - ETA: 1:03 - loss: 1.1196 - regression_loss: 0.9860 - classification_loss: 0.1336 314/500 [=================>............] - ETA: 1:03 - loss: 1.1213 - regression_loss: 0.9874 - classification_loss: 0.1339 315/500 [=================>............] - ETA: 1:02 - loss: 1.1223 - regression_loss: 0.9882 - classification_loss: 0.1341 316/500 [=================>............] - ETA: 1:02 - loss: 1.1224 - regression_loss: 0.9884 - classification_loss: 0.1340 317/500 [==================>...........] - ETA: 1:02 - loss: 1.1211 - regression_loss: 0.9871 - classification_loss: 0.1340 318/500 [==================>...........] - ETA: 1:01 - loss: 1.1218 - regression_loss: 0.9878 - classification_loss: 0.1340 319/500 [==================>...........] - ETA: 1:01 - loss: 1.1226 - regression_loss: 0.9885 - classification_loss: 0.1341 320/500 [==================>...........] - ETA: 1:01 - loss: 1.1222 - regression_loss: 0.9882 - classification_loss: 0.1341 321/500 [==================>...........] - ETA: 1:00 - loss: 1.1216 - regression_loss: 0.9877 - classification_loss: 0.1339 322/500 [==================>...........] - ETA: 1:00 - loss: 1.1219 - regression_loss: 0.9880 - classification_loss: 0.1339 323/500 [==================>...........] - ETA: 1:00 - loss: 1.1229 - regression_loss: 0.9888 - classification_loss: 0.1340 324/500 [==================>...........] 
- ETA: 59s - loss: 1.1213 - regression_loss: 0.9875 - classification_loss: 0.1338  325/500 [==================>...........] - ETA: 59s - loss: 1.1213 - regression_loss: 0.9877 - classification_loss: 0.1337 326/500 [==================>...........] - ETA: 59s - loss: 1.1226 - regression_loss: 0.9885 - classification_loss: 0.1341 327/500 [==================>...........] - ETA: 58s - loss: 1.1213 - regression_loss: 0.9874 - classification_loss: 0.1339 328/500 [==================>...........] - ETA: 58s - loss: 1.1223 - regression_loss: 0.9882 - classification_loss: 0.1341 329/500 [==================>...........] - ETA: 58s - loss: 1.1231 - regression_loss: 0.9889 - classification_loss: 0.1342 330/500 [==================>...........] - ETA: 57s - loss: 1.1232 - regression_loss: 0.9890 - classification_loss: 0.1342 331/500 [==================>...........] - ETA: 57s - loss: 1.1236 - regression_loss: 0.9895 - classification_loss: 0.1342 332/500 [==================>...........] - ETA: 56s - loss: 1.1239 - regression_loss: 0.9899 - classification_loss: 0.1340 333/500 [==================>...........] - ETA: 56s - loss: 1.1220 - regression_loss: 0.9883 - classification_loss: 0.1336 334/500 [===================>..........] - ETA: 56s - loss: 1.1214 - regression_loss: 0.9879 - classification_loss: 0.1335 335/500 [===================>..........] - ETA: 55s - loss: 1.1212 - regression_loss: 0.9878 - classification_loss: 0.1334 336/500 [===================>..........] - ETA: 55s - loss: 1.1214 - regression_loss: 0.9879 - classification_loss: 0.1335 337/500 [===================>..........] - ETA: 55s - loss: 1.1212 - regression_loss: 0.9877 - classification_loss: 0.1335 338/500 [===================>..........] - ETA: 54s - loss: 1.1211 - regression_loss: 0.9876 - classification_loss: 0.1335 339/500 [===================>..........] - ETA: 54s - loss: 1.1208 - regression_loss: 0.9874 - classification_loss: 0.1334 340/500 [===================>..........] 
- ETA: 54s - loss: 1.1215 - regression_loss: 0.9880 - classification_loss: 0.1335 341/500 [===================>..........] - ETA: 53s - loss: 1.1210 - regression_loss: 0.9876 - classification_loss: 0.1334 342/500 [===================>..........] - ETA: 53s - loss: 1.1207 - regression_loss: 0.9873 - classification_loss: 0.1333 343/500 [===================>..........] - ETA: 53s - loss: 1.1217 - regression_loss: 0.9881 - classification_loss: 0.1336 344/500 [===================>..........] - ETA: 52s - loss: 1.1207 - regression_loss: 0.9873 - classification_loss: 0.1334 345/500 [===================>..........] - ETA: 52s - loss: 1.1214 - regression_loss: 0.9879 - classification_loss: 0.1335 346/500 [===================>..........] - ETA: 52s - loss: 1.1212 - regression_loss: 0.9877 - classification_loss: 0.1335 347/500 [===================>..........] - ETA: 51s - loss: 1.1201 - regression_loss: 0.9868 - classification_loss: 0.1334 348/500 [===================>..........] - ETA: 51s - loss: 1.1221 - regression_loss: 0.9884 - classification_loss: 0.1336 349/500 [===================>..........] - ETA: 51s - loss: 1.1266 - regression_loss: 0.9918 - classification_loss: 0.1348 350/500 [====================>.........] - ETA: 50s - loss: 1.1256 - regression_loss: 0.9910 - classification_loss: 0.1346 351/500 [====================>.........] - ETA: 50s - loss: 1.1290 - regression_loss: 0.9940 - classification_loss: 0.1350 352/500 [====================>.........] - ETA: 50s - loss: 1.1281 - regression_loss: 0.9934 - classification_loss: 0.1348 353/500 [====================>.........] - ETA: 49s - loss: 1.1266 - regression_loss: 0.9920 - classification_loss: 0.1346 354/500 [====================>.........] - ETA: 49s - loss: 1.1269 - regression_loss: 0.9923 - classification_loss: 0.1346 355/500 [====================>.........] - ETA: 49s - loss: 1.1268 - regression_loss: 0.9922 - classification_loss: 0.1346 356/500 [====================>.........] 
- ETA: 48s - loss: 1.1270 - regression_loss: 0.9925 - classification_loss: 0.1345 357/500 [====================>.........] - ETA: 48s - loss: 1.1265 - regression_loss: 0.9920 - classification_loss: 0.1345 358/500 [====================>.........] - ETA: 48s - loss: 1.1246 - regression_loss: 0.9903 - classification_loss: 0.1342 359/500 [====================>.........] - ETA: 47s - loss: 1.1221 - regression_loss: 0.9881 - classification_loss: 0.1339 360/500 [====================>.........] - ETA: 47s - loss: 1.1219 - regression_loss: 0.9880 - classification_loss: 0.1339 361/500 [====================>.........] - ETA: 47s - loss: 1.1210 - regression_loss: 0.9872 - classification_loss: 0.1338 362/500 [====================>.........] - ETA: 46s - loss: 1.1224 - regression_loss: 0.9885 - classification_loss: 0.1339 363/500 [====================>.........] - ETA: 46s - loss: 1.1212 - regression_loss: 0.9875 - classification_loss: 0.1337 364/500 [====================>.........] - ETA: 46s - loss: 1.1223 - regression_loss: 0.9885 - classification_loss: 0.1338 365/500 [====================>.........] - ETA: 45s - loss: 1.1222 - regression_loss: 0.9885 - classification_loss: 0.1338 366/500 [====================>.........] - ETA: 45s - loss: 1.1214 - regression_loss: 0.9878 - classification_loss: 0.1336 367/500 [=====================>........] - ETA: 45s - loss: 1.1208 - regression_loss: 0.9874 - classification_loss: 0.1334 368/500 [=====================>........] - ETA: 44s - loss: 1.1207 - regression_loss: 0.9873 - classification_loss: 0.1334 369/500 [=====================>........] - ETA: 44s - loss: 1.1220 - regression_loss: 0.9885 - classification_loss: 0.1335 370/500 [=====================>........] - ETA: 44s - loss: 1.1213 - regression_loss: 0.9880 - classification_loss: 0.1333 371/500 [=====================>........] - ETA: 43s - loss: 1.1217 - regression_loss: 0.9882 - classification_loss: 0.1335 372/500 [=====================>........] 
- ETA: 43s - loss: 1.1215 - regression_loss: 0.9881 - classification_loss: 0.1334 373/500 [=====================>........] - ETA: 43s - loss: 1.1199 - regression_loss: 0.9868 - classification_loss: 0.1332 374/500 [=====================>........] - ETA: 42s - loss: 1.1200 - regression_loss: 0.9869 - classification_loss: 0.1332 375/500 [=====================>........] - ETA: 42s - loss: 1.1186 - regression_loss: 0.9856 - classification_loss: 0.1330 376/500 [=====================>........] - ETA: 42s - loss: 1.1176 - regression_loss: 0.9846 - classification_loss: 0.1330 377/500 [=====================>........] - ETA: 41s - loss: 1.1184 - regression_loss: 0.9852 - classification_loss: 0.1332 378/500 [=====================>........] - ETA: 41s - loss: 1.1171 - regression_loss: 0.9841 - classification_loss: 0.1330 379/500 [=====================>........] - ETA: 41s - loss: 1.1181 - regression_loss: 0.9850 - classification_loss: 0.1331 380/500 [=====================>........] - ETA: 40s - loss: 1.1168 - regression_loss: 0.9838 - classification_loss: 0.1330 381/500 [=====================>........] - ETA: 40s - loss: 1.1171 - regression_loss: 0.9843 - classification_loss: 0.1328 382/500 [=====================>........] - ETA: 40s - loss: 1.1175 - regression_loss: 0.9847 - classification_loss: 0.1329 383/500 [=====================>........] - ETA: 39s - loss: 1.1183 - regression_loss: 0.9853 - classification_loss: 0.1329 384/500 [======================>.......] - ETA: 39s - loss: 1.1189 - regression_loss: 0.9860 - classification_loss: 0.1329 385/500 [======================>.......] - ETA: 39s - loss: 1.1183 - regression_loss: 0.9855 - classification_loss: 0.1327 386/500 [======================>.......] - ETA: 38s - loss: 1.1174 - regression_loss: 0.9848 - classification_loss: 0.1326 387/500 [======================>.......] - ETA: 38s - loss: 1.1174 - regression_loss: 0.9850 - classification_loss: 0.1325 388/500 [======================>.......] 
- ETA: 38s - loss: 1.1172 - regression_loss: 0.9847 - classification_loss: 0.1324 389/500 [======================>.......] - ETA: 37s - loss: 1.1188 - regression_loss: 0.9862 - classification_loss: 0.1326 390/500 [======================>.......] - ETA: 37s - loss: 1.1179 - regression_loss: 0.9855 - classification_loss: 0.1324 391/500 [======================>.......] - ETA: 36s - loss: 1.1183 - regression_loss: 0.9857 - classification_loss: 0.1326 392/500 [======================>.......] - ETA: 36s - loss: 1.1186 - regression_loss: 0.9860 - classification_loss: 0.1326 393/500 [======================>.......] - ETA: 36s - loss: 1.1190 - regression_loss: 0.9863 - classification_loss: 0.1327 394/500 [======================>.......] - ETA: 35s - loss: 1.1196 - regression_loss: 0.9868 - classification_loss: 0.1327 395/500 [======================>.......] - ETA: 35s - loss: 1.1192 - regression_loss: 0.9865 - classification_loss: 0.1327 396/500 [======================>.......] - ETA: 35s - loss: 1.1194 - regression_loss: 0.9867 - classification_loss: 0.1327 397/500 [======================>.......] - ETA: 34s - loss: 1.1202 - regression_loss: 0.9874 - classification_loss: 0.1328 398/500 [======================>.......] - ETA: 34s - loss: 1.1205 - regression_loss: 0.9877 - classification_loss: 0.1328 399/500 [======================>.......] - ETA: 34s - loss: 1.1217 - regression_loss: 0.9888 - classification_loss: 0.1329 400/500 [=======================>......] - ETA: 33s - loss: 1.1223 - regression_loss: 0.9892 - classification_loss: 0.1331 401/500 [=======================>......] - ETA: 33s - loss: 1.1219 - regression_loss: 0.9890 - classification_loss: 0.1329 402/500 [=======================>......] - ETA: 33s - loss: 1.1224 - regression_loss: 0.9895 - classification_loss: 0.1329 403/500 [=======================>......] - ETA: 32s - loss: 1.1226 - regression_loss: 0.9896 - classification_loss: 0.1330 404/500 [=======================>......] 
- ETA: 32s - loss: 1.1234 - regression_loss: 0.9904 - classification_loss: 0.1331 405/500 [=======================>......] - ETA: 32s - loss: 1.1241 - regression_loss: 0.9909 - classification_loss: 0.1332 406/500 [=======================>......] - ETA: 31s - loss: 1.1245 - regression_loss: 0.9912 - classification_loss: 0.1333 407/500 [=======================>......] - ETA: 31s - loss: 1.1244 - regression_loss: 0.9911 - classification_loss: 0.1333 408/500 [=======================>......] - ETA: 31s - loss: 1.1242 - regression_loss: 0.9909 - classification_loss: 0.1333 409/500 [=======================>......] - ETA: 30s - loss: 1.1244 - regression_loss: 0.9911 - classification_loss: 0.1333 410/500 [=======================>......] - ETA: 30s - loss: 1.1254 - regression_loss: 0.9919 - classification_loss: 0.1335 411/500 [=======================>......] - ETA: 30s - loss: 1.1247 - regression_loss: 0.9913 - classification_loss: 0.1334 412/500 [=======================>......] - ETA: 29s - loss: 1.1250 - regression_loss: 0.9915 - classification_loss: 0.1334 413/500 [=======================>......] - ETA: 29s - loss: 1.1267 - regression_loss: 0.9928 - classification_loss: 0.1339 414/500 [=======================>......] - ETA: 29s - loss: 1.1264 - regression_loss: 0.9927 - classification_loss: 0.1337 415/500 [=======================>......] - ETA: 28s - loss: 1.1271 - regression_loss: 0.9934 - classification_loss: 0.1338 416/500 [=======================>......] - ETA: 28s - loss: 1.1269 - regression_loss: 0.9933 - classification_loss: 0.1336 417/500 [========================>.....] - ETA: 28s - loss: 1.1266 - regression_loss: 0.9930 - classification_loss: 0.1336 418/500 [========================>.....] - ETA: 27s - loss: 1.1264 - regression_loss: 0.9929 - classification_loss: 0.1336 419/500 [========================>.....] - ETA: 27s - loss: 1.1253 - regression_loss: 0.9919 - classification_loss: 0.1334 420/500 [========================>.....] 
- ETA: 27s - loss: 1.1251 - regression_loss: 0.9918 - classification_loss: 0.1333 421/500 [========================>.....] - ETA: 26s - loss: 1.1263 - regression_loss: 0.9927 - classification_loss: 0.1336 422/500 [========================>.....] - ETA: 26s - loss: 1.1260 - regression_loss: 0.9924 - classification_loss: 0.1336 423/500 [========================>.....] - ETA: 26s - loss: 1.1258 - regression_loss: 0.9923 - classification_loss: 0.1335 424/500 [========================>.....] - ETA: 25s - loss: 1.1248 - regression_loss: 0.9914 - classification_loss: 0.1334 425/500 [========================>.....] - ETA: 25s - loss: 1.1246 - regression_loss: 0.9912 - classification_loss: 0.1333 426/500 [========================>.....] - ETA: 25s - loss: 1.1263 - regression_loss: 0.9927 - classification_loss: 0.1336 427/500 [========================>.....] - ETA: 24s - loss: 1.1268 - regression_loss: 0.9932 - classification_loss: 0.1336 428/500 [========================>.....] - ETA: 24s - loss: 1.1262 - regression_loss: 0.9927 - classification_loss: 0.1335 429/500 [========================>.....] - ETA: 24s - loss: 1.1245 - regression_loss: 0.9913 - classification_loss: 0.1332 430/500 [========================>.....] - ETA: 23s - loss: 1.1237 - regression_loss: 0.9907 - classification_loss: 0.1330 431/500 [========================>.....] - ETA: 23s - loss: 1.1224 - regression_loss: 0.9896 - classification_loss: 0.1328 432/500 [========================>.....] - ETA: 23s - loss: 1.1211 - regression_loss: 0.9885 - classification_loss: 0.1327 433/500 [========================>.....] - ETA: 22s - loss: 1.1197 - regression_loss: 0.9873 - classification_loss: 0.1324 434/500 [=========================>....] - ETA: 22s - loss: 1.1199 - regression_loss: 0.9875 - classification_loss: 0.1325 435/500 [=========================>....] - ETA: 22s - loss: 1.1198 - regression_loss: 0.9874 - classification_loss: 0.1324 436/500 [=========================>....] 
- ETA: 21s - loss: 1.1192 - regression_loss: 0.9870 - classification_loss: 0.1322 437/500 [=========================>....] - ETA: 21s - loss: 1.1183 - regression_loss: 0.9863 - classification_loss: 0.1320 438/500 [=========================>....] - ETA: 21s - loss: 1.1183 - regression_loss: 0.9862 - classification_loss: 0.1320 439/500 [=========================>....] - ETA: 20s - loss: 1.1196 - regression_loss: 0.9874 - classification_loss: 0.1322 440/500 [=========================>....] - ETA: 20s - loss: 1.1202 - regression_loss: 0.9878 - classification_loss: 0.1324 441/500 [=========================>....] - ETA: 20s - loss: 1.1196 - regression_loss: 0.9872 - classification_loss: 0.1324 442/500 [=========================>....] - ETA: 19s - loss: 1.1194 - regression_loss: 0.9870 - classification_loss: 0.1323 443/500 [=========================>....] - ETA: 19s - loss: 1.1185 - regression_loss: 0.9864 - classification_loss: 0.1322 444/500 [=========================>....] - ETA: 19s - loss: 1.1193 - regression_loss: 0.9872 - classification_loss: 0.1321 445/500 [=========================>....] - ETA: 18s - loss: 1.1195 - regression_loss: 0.9873 - classification_loss: 0.1321 446/500 [=========================>....] - ETA: 18s - loss: 1.1197 - regression_loss: 0.9876 - classification_loss: 0.1321 447/500 [=========================>....] - ETA: 17s - loss: 1.1198 - regression_loss: 0.9877 - classification_loss: 0.1322 448/500 [=========================>....] - ETA: 17s - loss: 1.1186 - regression_loss: 0.9865 - classification_loss: 0.1321 449/500 [=========================>....] - ETA: 17s - loss: 1.1188 - regression_loss: 0.9866 - classification_loss: 0.1322 450/500 [==========================>...] - ETA: 16s - loss: 1.1188 - regression_loss: 0.9867 - classification_loss: 0.1321 451/500 [==========================>...] - ETA: 16s - loss: 1.1187 - regression_loss: 0.9866 - classification_loss: 0.1321 452/500 [==========================>...] 
- ETA: 16s - loss: 1.1180 - regression_loss: 0.9859 - classification_loss: 0.1321 453/500 [==========================>...] - ETA: 15s - loss: 1.1180 - regression_loss: 0.9860 - classification_loss: 0.1320 454/500 [==========================>...] - ETA: 15s - loss: 1.1166 - regression_loss: 0.9848 - classification_loss: 0.1319 455/500 [==========================>...] - ETA: 15s - loss: 1.1177 - regression_loss: 0.9856 - classification_loss: 0.1321 456/500 [==========================>...] - ETA: 14s - loss: 1.1184 - regression_loss: 0.9862 - classification_loss: 0.1322 457/500 [==========================>...] - ETA: 14s - loss: 1.1178 - regression_loss: 0.9858 - classification_loss: 0.1320 458/500 [==========================>...] - ETA: 14s - loss: 1.1181 - regression_loss: 0.9861 - classification_loss: 0.1319 459/500 [==========================>...] - ETA: 13s - loss: 1.1172 - regression_loss: 0.9854 - classification_loss: 0.1318 460/500 [==========================>...] - ETA: 13s - loss: 1.1175 - regression_loss: 0.9857 - classification_loss: 0.1318 461/500 [==========================>...] - ETA: 13s - loss: 1.1179 - regression_loss: 0.9861 - classification_loss: 0.1318 462/500 [==========================>...] - ETA: 12s - loss: 1.1172 - regression_loss: 0.9855 - classification_loss: 0.1317 463/500 [==========================>...] - ETA: 12s - loss: 1.1176 - regression_loss: 0.9858 - classification_loss: 0.1318 464/500 [==========================>...] - ETA: 12s - loss: 1.1178 - regression_loss: 0.9860 - classification_loss: 0.1317 465/500 [==========================>...] - ETA: 11s - loss: 1.1187 - regression_loss: 0.9871 - classification_loss: 0.1316 466/500 [==========================>...] - ETA: 11s - loss: 1.1178 - regression_loss: 0.9863 - classification_loss: 0.1315 467/500 [===========================>..] - ETA: 11s - loss: 1.1181 - regression_loss: 0.9866 - classification_loss: 0.1315 468/500 [===========================>..] 
- ETA: 10s - loss: 1.1182 - regression_loss: 0.9867 - classification_loss: 0.1315 469/500 [===========================>..] - ETA: 10s - loss: 1.1178 - regression_loss: 0.9863 - classification_loss: 0.1315 470/500 [===========================>..] - ETA: 10s - loss: 1.1169 - regression_loss: 0.9855 - classification_loss: 0.1314 471/500 [===========================>..] - ETA: 9s - loss: 1.1160 - regression_loss: 0.9847 - classification_loss: 0.1313  472/500 [===========================>..] - ETA: 9s - loss: 1.1165 - regression_loss: 0.9851 - classification_loss: 0.1314 473/500 [===========================>..] - ETA: 9s - loss: 1.1163 - regression_loss: 0.9850 - classification_loss: 0.1313 474/500 [===========================>..] - ETA: 8s - loss: 1.1165 - regression_loss: 0.9853 - classification_loss: 0.1313 475/500 [===========================>..] - ETA: 8s - loss: 1.1165 - regression_loss: 0.9852 - classification_loss: 0.1312 476/500 [===========================>..] - ETA: 8s - loss: 1.1158 - regression_loss: 0.9846 - classification_loss: 0.1311 477/500 [===========================>..] - ETA: 7s - loss: 1.1161 - regression_loss: 0.9849 - classification_loss: 0.1312 478/500 [===========================>..] - ETA: 7s - loss: 1.1160 - regression_loss: 0.9849 - classification_loss: 0.1311 479/500 [===========================>..] - ETA: 7s - loss: 1.1158 - regression_loss: 0.9847 - classification_loss: 0.1311 480/500 [===========================>..] - ETA: 6s - loss: 1.1147 - regression_loss: 0.9838 - classification_loss: 0.1309 481/500 [===========================>..] - ETA: 6s - loss: 1.1149 - regression_loss: 0.9840 - classification_loss: 0.1310 482/500 [===========================>..] - ETA: 6s - loss: 1.1157 - regression_loss: 0.9847 - classification_loss: 0.1310 483/500 [===========================>..] - ETA: 5s - loss: 1.1145 - regression_loss: 0.9836 - classification_loss: 0.1309 484/500 [============================>.] 
- ETA: 5s - loss: 1.1144 - regression_loss: 0.9836 - classification_loss: 0.1308 485/500 [============================>.] - ETA: 5s - loss: 1.1144 - regression_loss: 0.9837 - classification_loss: 0.1307 486/500 [============================>.] - ETA: 4s - loss: 1.1152 - regression_loss: 0.9843 - classification_loss: 0.1309 487/500 [============================>.] - ETA: 4s - loss: 1.1148 - regression_loss: 0.9839 - classification_loss: 0.1309 488/500 [============================>.] - ETA: 4s - loss: 1.1146 - regression_loss: 0.9837 - classification_loss: 0.1309 489/500 [============================>.] - ETA: 3s - loss: 1.1144 - regression_loss: 0.9836 - classification_loss: 0.1309 490/500 [============================>.] - ETA: 3s - loss: 1.1137 - regression_loss: 0.9830 - classification_loss: 0.1307 491/500 [============================>.] - ETA: 3s - loss: 1.1127 - regression_loss: 0.9821 - classification_loss: 0.1306 492/500 [============================>.] - ETA: 2s - loss: 1.1123 - regression_loss: 0.9818 - classification_loss: 0.1305 493/500 [============================>.] - ETA: 2s - loss: 1.1119 - regression_loss: 0.9814 - classification_loss: 0.1304 494/500 [============================>.] - ETA: 2s - loss: 1.1126 - regression_loss: 0.9820 - classification_loss: 0.1306 495/500 [============================>.] - ETA: 1s - loss: 1.1119 - regression_loss: 0.9815 - classification_loss: 0.1305 496/500 [============================>.] - ETA: 1s - loss: 1.1123 - regression_loss: 0.9817 - classification_loss: 0.1306 497/500 [============================>.] - ETA: 1s - loss: 1.1124 - regression_loss: 0.9818 - classification_loss: 0.1306 498/500 [============================>.] - ETA: 0s - loss: 1.1121 - regression_loss: 0.9815 - classification_loss: 0.1305 499/500 [============================>.] 
- ETA: 0s - loss: 1.1120 - regression_loss: 0.9814 - classification_loss: 0.1305 500/500 [==============================] - 170s 340ms/step - loss: 1.1111 - regression_loss: 0.9806 - classification_loss: 0.1305 1172 instances of class plum with average precision: 0.7844 mAP: 0.7844 Epoch 00024: saving model to ./training/snapshots/resnet101_pascal_24.h5 Epoch 25/150 1/500 [..............................] - ETA: 2:36 - loss: 0.8647 - regression_loss: 0.7845 - classification_loss: 0.0802 2/500 [..............................] - ETA: 2:39 - loss: 1.1940 - regression_loss: 1.0485 - classification_loss: 0.1455 3/500 [..............................] - ETA: 2:43 - loss: 1.0040 - regression_loss: 0.8888 - classification_loss: 0.1151 4/500 [..............................] - ETA: 2:45 - loss: 1.1588 - regression_loss: 1.0205 - classification_loss: 0.1383 5/500 [..............................] - ETA: 2:48 - loss: 1.0194 - regression_loss: 0.8956 - classification_loss: 0.1239 6/500 [..............................] - ETA: 2:48 - loss: 1.0606 - regression_loss: 0.9324 - classification_loss: 0.1281 7/500 [..............................] - ETA: 2:48 - loss: 1.0469 - regression_loss: 0.9200 - classification_loss: 0.1269 8/500 [..............................] - ETA: 2:48 - loss: 1.0023 - regression_loss: 0.8821 - classification_loss: 0.1202 9/500 [..............................] - ETA: 2:46 - loss: 0.9690 - regression_loss: 0.8558 - classification_loss: 0.1132 10/500 [..............................] - ETA: 2:46 - loss: 0.9365 - regression_loss: 0.8310 - classification_loss: 0.1055 11/500 [..............................] - ETA: 2:47 - loss: 0.8835 - regression_loss: 0.7841 - classification_loss: 0.0994 12/500 [..............................] - ETA: 2:47 - loss: 0.8675 - regression_loss: 0.7735 - classification_loss: 0.0941 13/500 [..............................] - ETA: 2:47 - loss: 0.9134 - regression_loss: 0.8122 - classification_loss: 0.1012 14/500 [..............................] 
[per-batch progress output for epoch 25 elided; training continues]
- ETA: 1:40 - loss: 1.0702 - regression_loss: 0.9492 - classification_loss: 0.1210 207/500 [===========>..................] - ETA: 1:39 - loss: 1.0695 - regression_loss: 0.9487 - classification_loss: 0.1208 208/500 [===========>..................] - ETA: 1:39 - loss: 1.0699 - regression_loss: 0.9492 - classification_loss: 0.1207 209/500 [===========>..................] - ETA: 1:39 - loss: 1.0721 - regression_loss: 0.9511 - classification_loss: 0.1210 210/500 [===========>..................] - ETA: 1:38 - loss: 1.0733 - regression_loss: 0.9522 - classification_loss: 0.1211 211/500 [===========>..................] - ETA: 1:38 - loss: 1.0767 - regression_loss: 0.9549 - classification_loss: 0.1218 212/500 [===========>..................] - ETA: 1:38 - loss: 1.0780 - regression_loss: 0.9562 - classification_loss: 0.1218 213/500 [===========>..................] - ETA: 1:37 - loss: 1.0782 - regression_loss: 0.9563 - classification_loss: 0.1220 214/500 [===========>..................] - ETA: 1:37 - loss: 1.0801 - regression_loss: 0.9578 - classification_loss: 0.1223 215/500 [===========>..................] - ETA: 1:37 - loss: 1.0770 - regression_loss: 0.9550 - classification_loss: 0.1220 216/500 [===========>..................] - ETA: 1:36 - loss: 1.0787 - regression_loss: 0.9566 - classification_loss: 0.1222 217/500 [============>.................] - ETA: 1:36 - loss: 1.0803 - regression_loss: 0.9580 - classification_loss: 0.1223 218/500 [============>.................] - ETA: 1:35 - loss: 1.0767 - regression_loss: 0.9549 - classification_loss: 0.1219 219/500 [============>.................] - ETA: 1:35 - loss: 1.0788 - regression_loss: 0.9566 - classification_loss: 0.1222 220/500 [============>.................] - ETA: 1:35 - loss: 1.0781 - regression_loss: 0.9561 - classification_loss: 0.1220 221/500 [============>.................] - ETA: 1:34 - loss: 1.0770 - regression_loss: 0.9551 - classification_loss: 0.1219 222/500 [============>.................] 
- ETA: 1:34 - loss: 1.0745 - regression_loss: 0.9531 - classification_loss: 0.1215 223/500 [============>.................] - ETA: 1:34 - loss: 1.0753 - regression_loss: 0.9537 - classification_loss: 0.1216 224/500 [============>.................] - ETA: 1:33 - loss: 1.0749 - regression_loss: 0.9536 - classification_loss: 0.1214 225/500 [============>.................] - ETA: 1:33 - loss: 1.0763 - regression_loss: 0.9550 - classification_loss: 0.1213 226/500 [============>.................] - ETA: 1:33 - loss: 1.0770 - regression_loss: 0.9555 - classification_loss: 0.1215 227/500 [============>.................] - ETA: 1:32 - loss: 1.0760 - regression_loss: 0.9548 - classification_loss: 0.1212 228/500 [============>.................] - ETA: 1:32 - loss: 1.0731 - regression_loss: 0.9521 - classification_loss: 0.1210 229/500 [============>.................] - ETA: 1:32 - loss: 1.0729 - regression_loss: 0.9520 - classification_loss: 0.1209 230/500 [============>.................] - ETA: 1:31 - loss: 1.0704 - regression_loss: 0.9499 - classification_loss: 0.1206 231/500 [============>.................] - ETA: 1:31 - loss: 1.0709 - regression_loss: 0.9502 - classification_loss: 0.1208 232/500 [============>.................] - ETA: 1:31 - loss: 1.0700 - regression_loss: 0.9494 - classification_loss: 0.1205 233/500 [============>.................] - ETA: 1:30 - loss: 1.0725 - regression_loss: 0.9517 - classification_loss: 0.1209 234/500 [=============>................] - ETA: 1:30 - loss: 1.0700 - regression_loss: 0.9495 - classification_loss: 0.1205 235/500 [=============>................] - ETA: 1:30 - loss: 1.0688 - regression_loss: 0.9485 - classification_loss: 0.1203 236/500 [=============>................] - ETA: 1:29 - loss: 1.0667 - regression_loss: 0.9467 - classification_loss: 0.1200 237/500 [=============>................] - ETA: 1:29 - loss: 1.0652 - regression_loss: 0.9452 - classification_loss: 0.1200 238/500 [=============>................] 
- ETA: 1:29 - loss: 1.0657 - regression_loss: 0.9456 - classification_loss: 0.1200 239/500 [=============>................] - ETA: 1:28 - loss: 1.0666 - regression_loss: 0.9465 - classification_loss: 0.1201 240/500 [=============>................] - ETA: 1:28 - loss: 1.0651 - regression_loss: 0.9454 - classification_loss: 0.1198 241/500 [=============>................] - ETA: 1:28 - loss: 1.0634 - regression_loss: 0.9438 - classification_loss: 0.1196 242/500 [=============>................] - ETA: 1:27 - loss: 1.0647 - regression_loss: 0.9449 - classification_loss: 0.1198 243/500 [=============>................] - ETA: 1:27 - loss: 1.0650 - regression_loss: 0.9453 - classification_loss: 0.1197 244/500 [=============>................] - ETA: 1:27 - loss: 1.0646 - regression_loss: 0.9450 - classification_loss: 0.1196 245/500 [=============>................] - ETA: 1:26 - loss: 1.0649 - regression_loss: 0.9452 - classification_loss: 0.1197 246/500 [=============>................] - ETA: 1:26 - loss: 1.0659 - regression_loss: 0.9462 - classification_loss: 0.1198 247/500 [=============>................] - ETA: 1:26 - loss: 1.0655 - regression_loss: 0.9458 - classification_loss: 0.1197 248/500 [=============>................] - ETA: 1:25 - loss: 1.0641 - regression_loss: 0.9447 - classification_loss: 0.1194 249/500 [=============>................] - ETA: 1:25 - loss: 1.0626 - regression_loss: 0.9434 - classification_loss: 0.1192 250/500 [==============>...............] - ETA: 1:25 - loss: 1.0625 - regression_loss: 0.9430 - classification_loss: 0.1195 251/500 [==============>...............] - ETA: 1:24 - loss: 1.0652 - regression_loss: 0.9452 - classification_loss: 0.1200 252/500 [==============>...............] - ETA: 1:24 - loss: 1.0650 - regression_loss: 0.9447 - classification_loss: 0.1203 253/500 [==============>...............] - ETA: 1:24 - loss: 1.0650 - regression_loss: 0.9447 - classification_loss: 0.1203 254/500 [==============>...............] 
- ETA: 1:23 - loss: 1.0644 - regression_loss: 0.9442 - classification_loss: 0.1202 255/500 [==============>...............] - ETA: 1:23 - loss: 1.0647 - regression_loss: 0.9444 - classification_loss: 0.1203 256/500 [==============>...............] - ETA: 1:23 - loss: 1.0666 - regression_loss: 0.9462 - classification_loss: 0.1204 257/500 [==============>...............] - ETA: 1:22 - loss: 1.0671 - regression_loss: 0.9465 - classification_loss: 0.1206 258/500 [==============>...............] - ETA: 1:22 - loss: 1.0684 - regression_loss: 0.9476 - classification_loss: 0.1208 259/500 [==============>...............] - ETA: 1:22 - loss: 1.0683 - regression_loss: 0.9473 - classification_loss: 0.1210 260/500 [==============>...............] - ETA: 1:21 - loss: 1.0660 - regression_loss: 0.9451 - classification_loss: 0.1208 261/500 [==============>...............] - ETA: 1:21 - loss: 1.0675 - regression_loss: 0.9464 - classification_loss: 0.1210 262/500 [==============>...............] - ETA: 1:21 - loss: 1.0687 - regression_loss: 0.9475 - classification_loss: 0.1212 263/500 [==============>...............] - ETA: 1:20 - loss: 1.0672 - regression_loss: 0.9463 - classification_loss: 0.1210 264/500 [==============>...............] - ETA: 1:20 - loss: 1.0668 - regression_loss: 0.9460 - classification_loss: 0.1208 265/500 [==============>...............] - ETA: 1:20 - loss: 1.0664 - regression_loss: 0.9455 - classification_loss: 0.1209 266/500 [==============>...............] - ETA: 1:19 - loss: 1.0678 - regression_loss: 0.9466 - classification_loss: 0.1211 267/500 [===============>..............] - ETA: 1:19 - loss: 1.0686 - regression_loss: 0.9473 - classification_loss: 0.1213 268/500 [===============>..............] - ETA: 1:19 - loss: 1.0706 - regression_loss: 0.9490 - classification_loss: 0.1216 269/500 [===============>..............] - ETA: 1:18 - loss: 1.0708 - regression_loss: 0.9493 - classification_loss: 0.1215 270/500 [===============>..............] 
- ETA: 1:18 - loss: 1.0699 - regression_loss: 0.9486 - classification_loss: 0.1213 271/500 [===============>..............] - ETA: 1:18 - loss: 1.0704 - regression_loss: 0.9491 - classification_loss: 0.1214 272/500 [===============>..............] - ETA: 1:17 - loss: 1.0717 - regression_loss: 0.9503 - classification_loss: 0.1215 273/500 [===============>..............] - ETA: 1:17 - loss: 1.0716 - regression_loss: 0.9500 - classification_loss: 0.1216 274/500 [===============>..............] - ETA: 1:16 - loss: 1.0716 - regression_loss: 0.9499 - classification_loss: 0.1217 275/500 [===============>..............] - ETA: 1:16 - loss: 1.0708 - regression_loss: 0.9493 - classification_loss: 0.1215 276/500 [===============>..............] - ETA: 1:16 - loss: 1.0714 - regression_loss: 0.9498 - classification_loss: 0.1217 277/500 [===============>..............] - ETA: 1:15 - loss: 1.0705 - regression_loss: 0.9490 - classification_loss: 0.1215 278/500 [===============>..............] - ETA: 1:15 - loss: 1.0722 - regression_loss: 0.9504 - classification_loss: 0.1218 279/500 [===============>..............] - ETA: 1:15 - loss: 1.0740 - regression_loss: 0.9519 - classification_loss: 0.1222 280/500 [===============>..............] - ETA: 1:14 - loss: 1.0744 - regression_loss: 0.9522 - classification_loss: 0.1223 281/500 [===============>..............] - ETA: 1:14 - loss: 1.0726 - regression_loss: 0.9507 - classification_loss: 0.1219 282/500 [===============>..............] - ETA: 1:14 - loss: 1.0711 - regression_loss: 0.9493 - classification_loss: 0.1218 283/500 [===============>..............] - ETA: 1:13 - loss: 1.0688 - regression_loss: 0.9474 - classification_loss: 0.1214 284/500 [================>.............] - ETA: 1:13 - loss: 1.0670 - regression_loss: 0.9459 - classification_loss: 0.1212 285/500 [================>.............] - ETA: 1:13 - loss: 1.0656 - regression_loss: 0.9446 - classification_loss: 0.1209 286/500 [================>.............] 
- ETA: 1:12 - loss: 1.0661 - regression_loss: 0.9451 - classification_loss: 0.1209 287/500 [================>.............] - ETA: 1:12 - loss: 1.0671 - regression_loss: 0.9461 - classification_loss: 0.1210 288/500 [================>.............] - ETA: 1:12 - loss: 1.0663 - regression_loss: 0.9455 - classification_loss: 0.1208 289/500 [================>.............] - ETA: 1:11 - loss: 1.0672 - regression_loss: 0.9466 - classification_loss: 0.1207 290/500 [================>.............] - ETA: 1:11 - loss: 1.0672 - regression_loss: 0.9464 - classification_loss: 0.1207 291/500 [================>.............] - ETA: 1:11 - loss: 1.0672 - regression_loss: 0.9465 - classification_loss: 0.1207 292/500 [================>.............] - ETA: 1:10 - loss: 1.0673 - regression_loss: 0.9461 - classification_loss: 0.1212 293/500 [================>.............] - ETA: 1:10 - loss: 1.0682 - regression_loss: 0.9468 - classification_loss: 0.1214 294/500 [================>.............] - ETA: 1:10 - loss: 1.0686 - regression_loss: 0.9472 - classification_loss: 0.1214 295/500 [================>.............] - ETA: 1:09 - loss: 1.0699 - regression_loss: 0.9483 - classification_loss: 0.1216 296/500 [================>.............] - ETA: 1:09 - loss: 1.0693 - regression_loss: 0.9476 - classification_loss: 0.1216 297/500 [================>.............] - ETA: 1:09 - loss: 1.0682 - regression_loss: 0.9466 - classification_loss: 0.1216 298/500 [================>.............] - ETA: 1:08 - loss: 1.0666 - regression_loss: 0.9453 - classification_loss: 0.1214 299/500 [================>.............] - ETA: 1:08 - loss: 1.0662 - regression_loss: 0.9450 - classification_loss: 0.1213 300/500 [=================>............] - ETA: 1:08 - loss: 1.0676 - regression_loss: 0.9461 - classification_loss: 0.1215 301/500 [=================>............] - ETA: 1:07 - loss: 1.0675 - regression_loss: 0.9460 - classification_loss: 0.1216 302/500 [=================>............] 
- ETA: 1:07 - loss: 1.0676 - regression_loss: 0.9460 - classification_loss: 0.1216 303/500 [=================>............] - ETA: 1:07 - loss: 1.0677 - regression_loss: 0.9461 - classification_loss: 0.1216 304/500 [=================>............] - ETA: 1:06 - loss: 1.0659 - regression_loss: 0.9446 - classification_loss: 0.1213 305/500 [=================>............] - ETA: 1:06 - loss: 1.0658 - regression_loss: 0.9444 - classification_loss: 0.1215 306/500 [=================>............] - ETA: 1:06 - loss: 1.0671 - regression_loss: 0.9453 - classification_loss: 0.1218 307/500 [=================>............] - ETA: 1:05 - loss: 1.0672 - regression_loss: 0.9456 - classification_loss: 0.1216 308/500 [=================>............] - ETA: 1:05 - loss: 1.0683 - regression_loss: 0.9465 - classification_loss: 0.1218 309/500 [=================>............] - ETA: 1:05 - loss: 1.0695 - regression_loss: 0.9475 - classification_loss: 0.1221 310/500 [=================>............] - ETA: 1:04 - loss: 1.0697 - regression_loss: 0.9477 - classification_loss: 0.1220 311/500 [=================>............] - ETA: 1:04 - loss: 1.0702 - regression_loss: 0.9482 - classification_loss: 0.1221 312/500 [=================>............] - ETA: 1:04 - loss: 1.0676 - regression_loss: 0.9459 - classification_loss: 0.1218 313/500 [=================>............] - ETA: 1:03 - loss: 1.0662 - regression_loss: 0.9446 - classification_loss: 0.1216 314/500 [=================>............] - ETA: 1:03 - loss: 1.0645 - regression_loss: 0.9428 - classification_loss: 0.1216 315/500 [=================>............] - ETA: 1:03 - loss: 1.0646 - regression_loss: 0.9430 - classification_loss: 0.1217 316/500 [=================>............] - ETA: 1:02 - loss: 1.0647 - regression_loss: 0.9431 - classification_loss: 0.1217 317/500 [==================>...........] - ETA: 1:02 - loss: 1.0653 - regression_loss: 0.9435 - classification_loss: 0.1218 318/500 [==================>...........] 
- ETA: 1:02 - loss: 1.0658 - regression_loss: 0.9440 - classification_loss: 0.1219 319/500 [==================>...........] - ETA: 1:01 - loss: 1.0657 - regression_loss: 0.9438 - classification_loss: 0.1219 320/500 [==================>...........] - ETA: 1:01 - loss: 1.0659 - regression_loss: 0.9440 - classification_loss: 0.1219 321/500 [==================>...........] - ETA: 1:00 - loss: 1.0655 - regression_loss: 0.9436 - classification_loss: 0.1219 322/500 [==================>...........] - ETA: 1:00 - loss: 1.0663 - regression_loss: 0.9446 - classification_loss: 0.1218 323/500 [==================>...........] - ETA: 1:00 - loss: 1.0645 - regression_loss: 0.9430 - classification_loss: 0.1215 324/500 [==================>...........] - ETA: 59s - loss: 1.0633 - regression_loss: 0.9421 - classification_loss: 0.1213  325/500 [==================>...........] - ETA: 59s - loss: 1.0642 - regression_loss: 0.9427 - classification_loss: 0.1214 326/500 [==================>...........] - ETA: 59s - loss: 1.0634 - regression_loss: 0.9420 - classification_loss: 0.1214 327/500 [==================>...........] - ETA: 58s - loss: 1.0633 - regression_loss: 0.9418 - classification_loss: 0.1214 328/500 [==================>...........] - ETA: 58s - loss: 1.0624 - regression_loss: 0.9412 - classification_loss: 0.1212 329/500 [==================>...........] - ETA: 58s - loss: 1.0631 - regression_loss: 0.9418 - classification_loss: 0.1212 330/500 [==================>...........] - ETA: 57s - loss: 1.0621 - regression_loss: 0.9410 - classification_loss: 0.1211 331/500 [==================>...........] - ETA: 57s - loss: 1.0617 - regression_loss: 0.9406 - classification_loss: 0.1211 332/500 [==================>...........] - ETA: 57s - loss: 1.0650 - regression_loss: 0.9436 - classification_loss: 0.1214 333/500 [==================>...........] - ETA: 56s - loss: 1.0651 - regression_loss: 0.9437 - classification_loss: 0.1214 334/500 [===================>..........] 
- ETA: 56s - loss: 1.0651 - regression_loss: 0.9438 - classification_loss: 0.1213 335/500 [===================>..........] - ETA: 56s - loss: 1.0641 - regression_loss: 0.9429 - classification_loss: 0.1212 336/500 [===================>..........] - ETA: 55s - loss: 1.0645 - regression_loss: 0.9432 - classification_loss: 0.1213 337/500 [===================>..........] - ETA: 55s - loss: 1.0659 - regression_loss: 0.9443 - classification_loss: 0.1216 338/500 [===================>..........] - ETA: 55s - loss: 1.0655 - regression_loss: 0.9440 - classification_loss: 0.1215 339/500 [===================>..........] - ETA: 54s - loss: 1.0642 - regression_loss: 0.9429 - classification_loss: 0.1213 340/500 [===================>..........] - ETA: 54s - loss: 1.0641 - regression_loss: 0.9429 - classification_loss: 0.1212 341/500 [===================>..........] - ETA: 54s - loss: 1.0640 - regression_loss: 0.9428 - classification_loss: 0.1212 342/500 [===================>..........] - ETA: 53s - loss: 1.0639 - regression_loss: 0.9428 - classification_loss: 0.1212 343/500 [===================>..........] - ETA: 53s - loss: 1.0640 - regression_loss: 0.9428 - classification_loss: 0.1212 344/500 [===================>..........] - ETA: 53s - loss: 1.0634 - regression_loss: 0.9423 - classification_loss: 0.1212 345/500 [===================>..........] - ETA: 52s - loss: 1.0646 - regression_loss: 0.9433 - classification_loss: 0.1213 346/500 [===================>..........] - ETA: 52s - loss: 1.0640 - regression_loss: 0.9427 - classification_loss: 0.1212 347/500 [===================>..........] - ETA: 52s - loss: 1.0660 - regression_loss: 0.9445 - classification_loss: 0.1215 348/500 [===================>..........] - ETA: 51s - loss: 1.0671 - regression_loss: 0.9455 - classification_loss: 0.1216 349/500 [===================>..........] - ETA: 51s - loss: 1.0676 - regression_loss: 0.9459 - classification_loss: 0.1217 350/500 [====================>.........] 
- ETA: 51s - loss: 1.0658 - regression_loss: 0.9443 - classification_loss: 0.1214 351/500 [====================>.........] - ETA: 50s - loss: 1.0665 - regression_loss: 0.9449 - classification_loss: 0.1216 352/500 [====================>.........] - ETA: 50s - loss: 1.0650 - regression_loss: 0.9436 - classification_loss: 0.1214 353/500 [====================>.........] - ETA: 50s - loss: 1.0662 - regression_loss: 0.9446 - classification_loss: 0.1215 354/500 [====================>.........] - ETA: 49s - loss: 1.0646 - regression_loss: 0.9433 - classification_loss: 0.1214 355/500 [====================>.........] - ETA: 49s - loss: 1.0641 - regression_loss: 0.9429 - classification_loss: 0.1212 356/500 [====================>.........] - ETA: 49s - loss: 1.0650 - regression_loss: 0.9437 - classification_loss: 0.1212 357/500 [====================>.........] - ETA: 48s - loss: 1.0653 - regression_loss: 0.9441 - classification_loss: 0.1212 358/500 [====================>.........] - ETA: 48s - loss: 1.0670 - regression_loss: 0.9455 - classification_loss: 0.1214 359/500 [====================>.........] - ETA: 48s - loss: 1.0667 - regression_loss: 0.9455 - classification_loss: 0.1212 360/500 [====================>.........] - ETA: 47s - loss: 1.0673 - regression_loss: 0.9459 - classification_loss: 0.1214 361/500 [====================>.........] - ETA: 47s - loss: 1.0674 - regression_loss: 0.9460 - classification_loss: 0.1214 362/500 [====================>.........] - ETA: 46s - loss: 1.0666 - regression_loss: 0.9453 - classification_loss: 0.1213 363/500 [====================>.........] - ETA: 46s - loss: 1.0675 - regression_loss: 0.9459 - classification_loss: 0.1216 364/500 [====================>.........] - ETA: 46s - loss: 1.0737 - regression_loss: 0.9492 - classification_loss: 0.1245 365/500 [====================>.........] - ETA: 45s - loss: 1.0743 - regression_loss: 0.9497 - classification_loss: 0.1245 366/500 [====================>.........] 
- ETA: 45s - loss: 1.0735 - regression_loss: 0.9491 - classification_loss: 0.1244 367/500 [=====================>........] - ETA: 45s - loss: 1.0739 - regression_loss: 0.9492 - classification_loss: 0.1246 368/500 [=====================>........] - ETA: 44s - loss: 1.0735 - regression_loss: 0.9489 - classification_loss: 0.1246 369/500 [=====================>........] - ETA: 44s - loss: 1.0746 - regression_loss: 0.9498 - classification_loss: 0.1248 370/500 [=====================>........] - ETA: 44s - loss: 1.0739 - regression_loss: 0.9492 - classification_loss: 0.1247 371/500 [=====================>........] - ETA: 43s - loss: 1.0746 - regression_loss: 0.9497 - classification_loss: 0.1248 372/500 [=====================>........] - ETA: 43s - loss: 1.0747 - regression_loss: 0.9499 - classification_loss: 0.1249 373/500 [=====================>........] - ETA: 43s - loss: 1.0750 - regression_loss: 0.9500 - classification_loss: 0.1250 374/500 [=====================>........] - ETA: 42s - loss: 1.0746 - regression_loss: 0.9497 - classification_loss: 0.1249 375/500 [=====================>........] - ETA: 42s - loss: 1.0760 - regression_loss: 0.9508 - classification_loss: 0.1251 376/500 [=====================>........] - ETA: 42s - loss: 1.0750 - regression_loss: 0.9501 - classification_loss: 0.1249 377/500 [=====================>........] - ETA: 41s - loss: 1.0755 - regression_loss: 0.9505 - classification_loss: 0.1250 378/500 [=====================>........] - ETA: 41s - loss: 1.0741 - regression_loss: 0.9493 - classification_loss: 0.1248 379/500 [=====================>........] - ETA: 41s - loss: 1.0755 - regression_loss: 0.9505 - classification_loss: 0.1250 380/500 [=====================>........] - ETA: 40s - loss: 1.0757 - regression_loss: 0.9506 - classification_loss: 0.1251 381/500 [=====================>........] - ETA: 40s - loss: 1.0767 - regression_loss: 0.9514 - classification_loss: 0.1253 382/500 [=====================>........] 
- ETA: 40s - loss: 1.0754 - regression_loss: 0.9503 - classification_loss: 0.1251 383/500 [=====================>........] - ETA: 39s - loss: 1.0751 - regression_loss: 0.9501 - classification_loss: 0.1250 384/500 [======================>.......] - ETA: 39s - loss: 1.0762 - regression_loss: 0.9510 - classification_loss: 0.1252 385/500 [======================>.......] - ETA: 39s - loss: 1.0767 - regression_loss: 0.9514 - classification_loss: 0.1253 386/500 [======================>.......] - ETA: 38s - loss: 1.0766 - regression_loss: 0.9514 - classification_loss: 0.1252 387/500 [======================>.......] - ETA: 38s - loss: 1.0775 - regression_loss: 0.9520 - classification_loss: 0.1255 388/500 [======================>.......] - ETA: 38s - loss: 1.0779 - regression_loss: 0.9524 - classification_loss: 0.1255 389/500 [======================>.......] - ETA: 37s - loss: 1.0782 - regression_loss: 0.9528 - classification_loss: 0.1254 390/500 [======================>.......] - ETA: 37s - loss: 1.0788 - regression_loss: 0.9532 - classification_loss: 0.1255 391/500 [======================>.......] - ETA: 37s - loss: 1.0797 - regression_loss: 0.9541 - classification_loss: 0.1256 392/500 [======================>.......] - ETA: 36s - loss: 1.0795 - regression_loss: 0.9540 - classification_loss: 0.1255 393/500 [======================>.......] - ETA: 36s - loss: 1.0784 - regression_loss: 0.9532 - classification_loss: 0.1253 394/500 [======================>.......] - ETA: 36s - loss: 1.0785 - regression_loss: 0.9532 - classification_loss: 0.1253 395/500 [======================>.......] - ETA: 35s - loss: 1.0783 - regression_loss: 0.9529 - classification_loss: 0.1254 396/500 [======================>.......] - ETA: 35s - loss: 1.0782 - regression_loss: 0.9528 - classification_loss: 0.1254 397/500 [======================>.......] - ETA: 35s - loss: 1.0767 - regression_loss: 0.9515 - classification_loss: 0.1252 398/500 [======================>.......] 
- ETA: 34s - loss: 1.0756 - regression_loss: 0.9506 - classification_loss: 0.1250 399/500 [======================>.......] - ETA: 34s - loss: 1.0743 - regression_loss: 0.9493 - classification_loss: 0.1250 400/500 [=======================>......] - ETA: 34s - loss: 1.0748 - regression_loss: 0.9497 - classification_loss: 0.1251 401/500 [=======================>......] - ETA: 33s - loss: 1.0754 - regression_loss: 0.9504 - classification_loss: 0.1250 402/500 [=======================>......] - ETA: 33s - loss: 1.0775 - regression_loss: 0.9523 - classification_loss: 0.1252 403/500 [=======================>......] - ETA: 33s - loss: 1.0772 - regression_loss: 0.9522 - classification_loss: 0.1250 404/500 [=======================>......] - ETA: 32s - loss: 1.0773 - regression_loss: 0.9523 - classification_loss: 0.1250 405/500 [=======================>......] - ETA: 32s - loss: 1.0778 - regression_loss: 0.9528 - classification_loss: 0.1250 406/500 [=======================>......] - ETA: 32s - loss: 1.0776 - regression_loss: 0.9526 - classification_loss: 0.1250 407/500 [=======================>......] - ETA: 31s - loss: 1.0778 - regression_loss: 0.9527 - classification_loss: 0.1250 408/500 [=======================>......] - ETA: 31s - loss: 1.0773 - regression_loss: 0.9524 - classification_loss: 0.1249 409/500 [=======================>......] - ETA: 31s - loss: 1.0772 - regression_loss: 0.9524 - classification_loss: 0.1248 410/500 [=======================>......] - ETA: 30s - loss: 1.0780 - regression_loss: 0.9530 - classification_loss: 0.1250 411/500 [=======================>......] - ETA: 30s - loss: 1.0783 - regression_loss: 0.9531 - classification_loss: 0.1252 412/500 [=======================>......] - ETA: 29s - loss: 1.0771 - regression_loss: 0.9521 - classification_loss: 0.1250 413/500 [=======================>......] - ETA: 29s - loss: 1.0758 - regression_loss: 0.9510 - classification_loss: 0.1248 414/500 [=======================>......] 
[... per-step progress output for epoch 25 (steps 414-499) trimmed; losses hover around loss ~1.07, regression_loss ~0.95, classification_loss ~0.12 ...]
500/500 [==============================] - 170s 340ms/step - loss: 1.0729 - regression_loss: 0.9493 - classification_loss: 0.1236
1172 instances of class plum with average precision: 0.7682
mAP: 0.7682
Epoch 00025: saving model to ./training/snapshots/resnet101_pascal_25.h5
Epoch 26/150
[... per-step progress output for epoch 26 trimmed; log continues mid-epoch at step 249/500 with loss ~1.04, regression_loss ~0.92, classification_loss ~0.12 ...]
- ETA: 1:25 - loss: 1.0424 - regression_loss: 0.9230 - classification_loss: 0.1194 250/500 [==============>...............] - ETA: 1:25 - loss: 1.0411 - regression_loss: 0.9219 - classification_loss: 0.1192 251/500 [==============>...............] - ETA: 1:24 - loss: 1.0383 - regression_loss: 0.9194 - classification_loss: 0.1188 252/500 [==============>...............] - ETA: 1:24 - loss: 1.0390 - regression_loss: 0.9201 - classification_loss: 0.1189 253/500 [==============>...............] - ETA: 1:24 - loss: 1.0364 - regression_loss: 0.9179 - classification_loss: 0.1186 254/500 [==============>...............] - ETA: 1:23 - loss: 1.0370 - regression_loss: 0.9186 - classification_loss: 0.1183 255/500 [==============>...............] - ETA: 1:23 - loss: 1.0383 - regression_loss: 0.9198 - classification_loss: 0.1185 256/500 [==============>...............] - ETA: 1:23 - loss: 1.0397 - regression_loss: 0.9211 - classification_loss: 0.1186 257/500 [==============>...............] - ETA: 1:22 - loss: 1.0399 - regression_loss: 0.9213 - classification_loss: 0.1187 258/500 [==============>...............] - ETA: 1:22 - loss: 1.0406 - regression_loss: 0.9219 - classification_loss: 0.1187 259/500 [==============>...............] - ETA: 1:22 - loss: 1.0388 - regression_loss: 0.9203 - classification_loss: 0.1185 260/500 [==============>...............] - ETA: 1:21 - loss: 1.0391 - regression_loss: 0.9207 - classification_loss: 0.1184 261/500 [==============>...............] - ETA: 1:21 - loss: 1.0409 - regression_loss: 0.9224 - classification_loss: 0.1185 262/500 [==============>...............] - ETA: 1:21 - loss: 1.0409 - regression_loss: 0.9224 - classification_loss: 0.1185 263/500 [==============>...............] - ETA: 1:20 - loss: 1.0423 - regression_loss: 0.9234 - classification_loss: 0.1189 264/500 [==============>...............] - ETA: 1:20 - loss: 1.0431 - regression_loss: 0.9237 - classification_loss: 0.1194 265/500 [==============>...............] 
- ETA: 1:20 - loss: 1.0439 - regression_loss: 0.9243 - classification_loss: 0.1196 266/500 [==============>...............] - ETA: 1:19 - loss: 1.0429 - regression_loss: 0.9234 - classification_loss: 0.1195 267/500 [===============>..............] - ETA: 1:19 - loss: 1.0433 - regression_loss: 0.9239 - classification_loss: 0.1194 268/500 [===============>..............] - ETA: 1:19 - loss: 1.0435 - regression_loss: 0.9238 - classification_loss: 0.1197 269/500 [===============>..............] - ETA: 1:18 - loss: 1.0433 - regression_loss: 0.9235 - classification_loss: 0.1197 270/500 [===============>..............] - ETA: 1:18 - loss: 1.0445 - regression_loss: 0.9245 - classification_loss: 0.1199 271/500 [===============>..............] - ETA: 1:18 - loss: 1.0450 - regression_loss: 0.9249 - classification_loss: 0.1201 272/500 [===============>..............] - ETA: 1:17 - loss: 1.0455 - regression_loss: 0.9254 - classification_loss: 0.1201 273/500 [===============>..............] - ETA: 1:17 - loss: 1.0457 - regression_loss: 0.9257 - classification_loss: 0.1200 274/500 [===============>..............] - ETA: 1:17 - loss: 1.0463 - regression_loss: 0.9263 - classification_loss: 0.1200 275/500 [===============>..............] - ETA: 1:16 - loss: 1.0456 - regression_loss: 0.9257 - classification_loss: 0.1198 276/500 [===============>..............] - ETA: 1:16 - loss: 1.0471 - regression_loss: 0.9269 - classification_loss: 0.1201 277/500 [===============>..............] - ETA: 1:16 - loss: 1.0469 - regression_loss: 0.9269 - classification_loss: 0.1200 278/500 [===============>..............] - ETA: 1:15 - loss: 1.0473 - regression_loss: 0.9272 - classification_loss: 0.1200 279/500 [===============>..............] - ETA: 1:15 - loss: 1.0472 - regression_loss: 0.9273 - classification_loss: 0.1200 280/500 [===============>..............] - ETA: 1:15 - loss: 1.0483 - regression_loss: 0.9282 - classification_loss: 0.1201 281/500 [===============>..............] 
- ETA: 1:14 - loss: 1.0480 - regression_loss: 0.9278 - classification_loss: 0.1201 282/500 [===============>..............] - ETA: 1:14 - loss: 1.0497 - regression_loss: 0.9294 - classification_loss: 0.1203 283/500 [===============>..............] - ETA: 1:14 - loss: 1.0512 - regression_loss: 0.9306 - classification_loss: 0.1205 284/500 [================>.............] - ETA: 1:13 - loss: 1.0494 - regression_loss: 0.9292 - classification_loss: 0.1203 285/500 [================>.............] - ETA: 1:13 - loss: 1.0483 - regression_loss: 0.9283 - classification_loss: 0.1200 286/500 [================>.............] - ETA: 1:13 - loss: 1.0491 - regression_loss: 0.9290 - classification_loss: 0.1201 287/500 [================>.............] - ETA: 1:12 - loss: 1.0495 - regression_loss: 0.9295 - classification_loss: 0.1200 288/500 [================>.............] - ETA: 1:12 - loss: 1.0493 - regression_loss: 0.9293 - classification_loss: 0.1200 289/500 [================>.............] - ETA: 1:11 - loss: 1.0514 - regression_loss: 0.9311 - classification_loss: 0.1204 290/500 [================>.............] - ETA: 1:11 - loss: 1.0523 - regression_loss: 0.9318 - classification_loss: 0.1205 291/500 [================>.............] - ETA: 1:11 - loss: 1.0526 - regression_loss: 0.9323 - classification_loss: 0.1203 292/500 [================>.............] - ETA: 1:10 - loss: 1.0519 - regression_loss: 0.9316 - classification_loss: 0.1202 293/500 [================>.............] - ETA: 1:10 - loss: 1.0527 - regression_loss: 0.9323 - classification_loss: 0.1204 294/500 [================>.............] - ETA: 1:10 - loss: 1.0512 - regression_loss: 0.9310 - classification_loss: 0.1202 295/500 [================>.............] - ETA: 1:09 - loss: 1.0530 - regression_loss: 0.9327 - classification_loss: 0.1203 296/500 [================>.............] - ETA: 1:09 - loss: 1.0532 - regression_loss: 0.9331 - classification_loss: 0.1202 297/500 [================>.............] 
- ETA: 1:09 - loss: 1.0534 - regression_loss: 0.9333 - classification_loss: 0.1201 298/500 [================>.............] - ETA: 1:08 - loss: 1.0548 - regression_loss: 0.9345 - classification_loss: 0.1203 299/500 [================>.............] - ETA: 1:08 - loss: 1.0535 - regression_loss: 0.9335 - classification_loss: 0.1200 300/500 [=================>............] - ETA: 1:08 - loss: 1.0518 - regression_loss: 0.9321 - classification_loss: 0.1198 301/500 [=================>............] - ETA: 1:07 - loss: 1.0525 - regression_loss: 0.9326 - classification_loss: 0.1199 302/500 [=================>............] - ETA: 1:07 - loss: 1.0531 - regression_loss: 0.9330 - classification_loss: 0.1201 303/500 [=================>............] - ETA: 1:07 - loss: 1.0542 - regression_loss: 0.9339 - classification_loss: 0.1203 304/500 [=================>............] - ETA: 1:06 - loss: 1.0553 - regression_loss: 0.9348 - classification_loss: 0.1204 305/500 [=================>............] - ETA: 1:06 - loss: 1.0534 - regression_loss: 0.9332 - classification_loss: 0.1202 306/500 [=================>............] - ETA: 1:06 - loss: 1.0532 - regression_loss: 0.9330 - classification_loss: 0.1202 307/500 [=================>............] - ETA: 1:05 - loss: 1.0532 - regression_loss: 0.9330 - classification_loss: 0.1202 308/500 [=================>............] - ETA: 1:05 - loss: 1.0515 - regression_loss: 0.9316 - classification_loss: 0.1199 309/500 [=================>............] - ETA: 1:05 - loss: 1.0492 - regression_loss: 0.9297 - classification_loss: 0.1196 310/500 [=================>............] - ETA: 1:04 - loss: 1.0474 - regression_loss: 0.9281 - classification_loss: 0.1193 311/500 [=================>............] - ETA: 1:04 - loss: 1.0488 - regression_loss: 0.9292 - classification_loss: 0.1195 312/500 [=================>............] - ETA: 1:04 - loss: 1.0494 - regression_loss: 0.9297 - classification_loss: 0.1197 313/500 [=================>............] 
- ETA: 1:03 - loss: 1.0492 - regression_loss: 0.9296 - classification_loss: 0.1196 314/500 [=================>............] - ETA: 1:03 - loss: 1.0493 - regression_loss: 0.9298 - classification_loss: 0.1195 315/500 [=================>............] - ETA: 1:03 - loss: 1.0493 - regression_loss: 0.9298 - classification_loss: 0.1195 316/500 [=================>............] - ETA: 1:02 - loss: 1.0501 - regression_loss: 0.9305 - classification_loss: 0.1197 317/500 [==================>...........] - ETA: 1:02 - loss: 1.0497 - regression_loss: 0.9303 - classification_loss: 0.1194 318/500 [==================>...........] - ETA: 1:02 - loss: 1.0485 - regression_loss: 0.9292 - classification_loss: 0.1192 319/500 [==================>...........] - ETA: 1:01 - loss: 1.0497 - regression_loss: 0.9303 - classification_loss: 0.1194 320/500 [==================>...........] - ETA: 1:01 - loss: 1.0487 - regression_loss: 0.9295 - classification_loss: 0.1192 321/500 [==================>...........] - ETA: 1:01 - loss: 1.0508 - regression_loss: 0.9314 - classification_loss: 0.1194 322/500 [==================>...........] - ETA: 1:00 - loss: 1.0493 - regression_loss: 0.9301 - classification_loss: 0.1192 323/500 [==================>...........] - ETA: 1:00 - loss: 1.0502 - regression_loss: 0.9307 - classification_loss: 0.1195 324/500 [==================>...........] - ETA: 1:00 - loss: 1.0507 - regression_loss: 0.9312 - classification_loss: 0.1194 325/500 [==================>...........] - ETA: 59s - loss: 1.0508 - regression_loss: 0.9314 - classification_loss: 0.1194  326/500 [==================>...........] - ETA: 59s - loss: 1.0507 - regression_loss: 0.9314 - classification_loss: 0.1192 327/500 [==================>...........] - ETA: 59s - loss: 1.0519 - regression_loss: 0.9325 - classification_loss: 0.1193 328/500 [==================>...........] - ETA: 58s - loss: 1.0512 - regression_loss: 0.9319 - classification_loss: 0.1192 329/500 [==================>...........] 
- ETA: 58s - loss: 1.0501 - regression_loss: 0.9311 - classification_loss: 0.1190 330/500 [==================>...........] - ETA: 58s - loss: 1.0495 - regression_loss: 0.9306 - classification_loss: 0.1190 331/500 [==================>...........] - ETA: 57s - loss: 1.0497 - regression_loss: 0.9307 - classification_loss: 0.1190 332/500 [==================>...........] - ETA: 57s - loss: 1.0494 - regression_loss: 0.9304 - classification_loss: 0.1190 333/500 [==================>...........] - ETA: 57s - loss: 1.0500 - regression_loss: 0.9309 - classification_loss: 0.1191 334/500 [===================>..........] - ETA: 56s - loss: 1.0500 - regression_loss: 0.9308 - classification_loss: 0.1192 335/500 [===================>..........] - ETA: 56s - loss: 1.0485 - regression_loss: 0.9296 - classification_loss: 0.1189 336/500 [===================>..........] - ETA: 56s - loss: 1.0492 - regression_loss: 0.9301 - classification_loss: 0.1191 337/500 [===================>..........] - ETA: 55s - loss: 1.0495 - regression_loss: 0.9304 - classification_loss: 0.1191 338/500 [===================>..........] - ETA: 55s - loss: 1.0490 - regression_loss: 0.9300 - classification_loss: 0.1190 339/500 [===================>..........] - ETA: 54s - loss: 1.0505 - regression_loss: 0.9313 - classification_loss: 0.1192 340/500 [===================>..........] - ETA: 54s - loss: 1.0534 - regression_loss: 0.9336 - classification_loss: 0.1198 341/500 [===================>..........] - ETA: 54s - loss: 1.0541 - regression_loss: 0.9342 - classification_loss: 0.1199 342/500 [===================>..........] - ETA: 53s - loss: 1.0539 - regression_loss: 0.9340 - classification_loss: 0.1199 343/500 [===================>..........] - ETA: 53s - loss: 1.0552 - regression_loss: 0.9349 - classification_loss: 0.1202 344/500 [===================>..........] - ETA: 53s - loss: 1.0538 - regression_loss: 0.9338 - classification_loss: 0.1200 345/500 [===================>..........] 
- ETA: 52s - loss: 1.0525 - regression_loss: 0.9327 - classification_loss: 0.1198 346/500 [===================>..........] - ETA: 52s - loss: 1.0521 - regression_loss: 0.9324 - classification_loss: 0.1197 347/500 [===================>..........] - ETA: 52s - loss: 1.0517 - regression_loss: 0.9319 - classification_loss: 0.1198 348/500 [===================>..........] - ETA: 51s - loss: 1.0513 - regression_loss: 0.9316 - classification_loss: 0.1197 349/500 [===================>..........] - ETA: 51s - loss: 1.0524 - regression_loss: 0.9325 - classification_loss: 0.1199 350/500 [====================>.........] - ETA: 51s - loss: 1.0524 - regression_loss: 0.9325 - classification_loss: 0.1199 351/500 [====================>.........] - ETA: 50s - loss: 1.0525 - regression_loss: 0.9327 - classification_loss: 0.1198 352/500 [====================>.........] - ETA: 50s - loss: 1.0529 - regression_loss: 0.9330 - classification_loss: 0.1199 353/500 [====================>.........] - ETA: 50s - loss: 1.0540 - regression_loss: 0.9341 - classification_loss: 0.1200 354/500 [====================>.........] - ETA: 49s - loss: 1.0534 - regression_loss: 0.9334 - classification_loss: 0.1199 355/500 [====================>.........] - ETA: 49s - loss: 1.0544 - regression_loss: 0.9345 - classification_loss: 0.1199 356/500 [====================>.........] - ETA: 49s - loss: 1.0539 - regression_loss: 0.9341 - classification_loss: 0.1198 357/500 [====================>.........] - ETA: 48s - loss: 1.0528 - regression_loss: 0.9331 - classification_loss: 0.1197 358/500 [====================>.........] - ETA: 48s - loss: 1.0536 - regression_loss: 0.9339 - classification_loss: 0.1198 359/500 [====================>.........] - ETA: 48s - loss: 1.0540 - regression_loss: 0.9342 - classification_loss: 0.1198 360/500 [====================>.........] - ETA: 47s - loss: 1.0548 - regression_loss: 0.9348 - classification_loss: 0.1199 361/500 [====================>.........] 
- ETA: 47s - loss: 1.0553 - regression_loss: 0.9353 - classification_loss: 0.1199 362/500 [====================>.........] - ETA: 47s - loss: 1.0545 - regression_loss: 0.9346 - classification_loss: 0.1198 363/500 [====================>.........] - ETA: 46s - loss: 1.0549 - regression_loss: 0.9351 - classification_loss: 0.1198 364/500 [====================>.........] - ETA: 46s - loss: 1.0552 - regression_loss: 0.9354 - classification_loss: 0.1198 365/500 [====================>.........] - ETA: 46s - loss: 1.0553 - regression_loss: 0.9355 - classification_loss: 0.1198 366/500 [====================>.........] - ETA: 45s - loss: 1.0545 - regression_loss: 0.9348 - classification_loss: 0.1196 367/500 [=====================>........] - ETA: 45s - loss: 1.0544 - regression_loss: 0.9348 - classification_loss: 0.1196 368/500 [=====================>........] - ETA: 45s - loss: 1.0527 - regression_loss: 0.9333 - classification_loss: 0.1194 369/500 [=====================>........] - ETA: 44s - loss: 1.0514 - regression_loss: 0.9322 - classification_loss: 0.1192 370/500 [=====================>........] - ETA: 44s - loss: 1.0500 - regression_loss: 0.9311 - classification_loss: 0.1189 371/500 [=====================>........] - ETA: 44s - loss: 1.0507 - regression_loss: 0.9316 - classification_loss: 0.1191 372/500 [=====================>........] - ETA: 43s - loss: 1.0518 - regression_loss: 0.9327 - classification_loss: 0.1192 373/500 [=====================>........] - ETA: 43s - loss: 1.0528 - regression_loss: 0.9334 - classification_loss: 0.1194 374/500 [=====================>........] - ETA: 43s - loss: 1.0510 - regression_loss: 0.9318 - classification_loss: 0.1192 375/500 [=====================>........] - ETA: 42s - loss: 1.0517 - regression_loss: 0.9326 - classification_loss: 0.1192 376/500 [=====================>........] - ETA: 42s - loss: 1.0503 - regression_loss: 0.9313 - classification_loss: 0.1190 377/500 [=====================>........] 
- ETA: 41s - loss: 1.0504 - regression_loss: 0.9314 - classification_loss: 0.1190 378/500 [=====================>........] - ETA: 41s - loss: 1.0508 - regression_loss: 0.9316 - classification_loss: 0.1192 379/500 [=====================>........] - ETA: 41s - loss: 1.0519 - regression_loss: 0.9325 - classification_loss: 0.1193 380/500 [=====================>........] - ETA: 40s - loss: 1.0518 - regression_loss: 0.9319 - classification_loss: 0.1200 381/500 [=====================>........] - ETA: 40s - loss: 1.0501 - regression_loss: 0.9303 - classification_loss: 0.1198 382/500 [=====================>........] - ETA: 40s - loss: 1.0495 - regression_loss: 0.9299 - classification_loss: 0.1196 383/500 [=====================>........] - ETA: 39s - loss: 1.0497 - regression_loss: 0.9301 - classification_loss: 0.1196 384/500 [======================>.......] - ETA: 39s - loss: 1.0496 - regression_loss: 0.9300 - classification_loss: 0.1196 385/500 [======================>.......] - ETA: 39s - loss: 1.0494 - regression_loss: 0.9298 - classification_loss: 0.1196 386/500 [======================>.......] - ETA: 38s - loss: 1.0491 - regression_loss: 0.9295 - classification_loss: 0.1196 387/500 [======================>.......] - ETA: 38s - loss: 1.0499 - regression_loss: 0.9301 - classification_loss: 0.1197 388/500 [======================>.......] - ETA: 38s - loss: 1.0509 - regression_loss: 0.9310 - classification_loss: 0.1199 389/500 [======================>.......] - ETA: 37s - loss: 1.0514 - regression_loss: 0.9315 - classification_loss: 0.1200 390/500 [======================>.......] - ETA: 37s - loss: 1.0513 - regression_loss: 0.9313 - classification_loss: 0.1200 391/500 [======================>.......] - ETA: 37s - loss: 1.0512 - regression_loss: 0.9312 - classification_loss: 0.1200 392/500 [======================>.......] - ETA: 36s - loss: 1.0512 - regression_loss: 0.9311 - classification_loss: 0.1201 393/500 [======================>.......] 
- ETA: 36s - loss: 1.0517 - regression_loss: 0.9315 - classification_loss: 0.1202 394/500 [======================>.......] - ETA: 36s - loss: 1.0517 - regression_loss: 0.9314 - classification_loss: 0.1203 395/500 [======================>.......] - ETA: 35s - loss: 1.0525 - regression_loss: 0.9320 - classification_loss: 0.1204 396/500 [======================>.......] - ETA: 35s - loss: 1.0515 - regression_loss: 0.9312 - classification_loss: 0.1203 397/500 [======================>.......] - ETA: 35s - loss: 1.0515 - regression_loss: 0.9312 - classification_loss: 0.1204 398/500 [======================>.......] - ETA: 34s - loss: 1.0502 - regression_loss: 0.9300 - classification_loss: 0.1202 399/500 [======================>.......] - ETA: 34s - loss: 1.0503 - regression_loss: 0.9301 - classification_loss: 0.1202 400/500 [=======================>......] - ETA: 34s - loss: 1.0516 - regression_loss: 0.9313 - classification_loss: 0.1203 401/500 [=======================>......] - ETA: 33s - loss: 1.0519 - regression_loss: 0.9315 - classification_loss: 0.1204 402/500 [=======================>......] - ETA: 33s - loss: 1.0526 - regression_loss: 0.9322 - classification_loss: 0.1205 403/500 [=======================>......] - ETA: 33s - loss: 1.0514 - regression_loss: 0.9311 - classification_loss: 0.1203 404/500 [=======================>......] - ETA: 32s - loss: 1.0523 - regression_loss: 0.9317 - classification_loss: 0.1206 405/500 [=======================>......] - ETA: 32s - loss: 1.0516 - regression_loss: 0.9311 - classification_loss: 0.1206 406/500 [=======================>......] - ETA: 32s - loss: 1.0507 - regression_loss: 0.9301 - classification_loss: 0.1205 407/500 [=======================>......] - ETA: 31s - loss: 1.0500 - regression_loss: 0.9297 - classification_loss: 0.1203 408/500 [=======================>......] - ETA: 31s - loss: 1.0484 - regression_loss: 0.9282 - classification_loss: 0.1202 409/500 [=======================>......] 
- ETA: 31s - loss: 1.0493 - regression_loss: 0.9290 - classification_loss: 0.1203 410/500 [=======================>......] - ETA: 30s - loss: 1.0479 - regression_loss: 0.9277 - classification_loss: 0.1202 411/500 [=======================>......] - ETA: 30s - loss: 1.0489 - regression_loss: 0.9286 - classification_loss: 0.1203 412/500 [=======================>......] - ETA: 30s - loss: 1.0483 - regression_loss: 0.9280 - classification_loss: 0.1202 413/500 [=======================>......] - ETA: 29s - loss: 1.0486 - regression_loss: 0.9283 - classification_loss: 0.1203 414/500 [=======================>......] - ETA: 29s - loss: 1.0485 - regression_loss: 0.9281 - classification_loss: 0.1205 415/500 [=======================>......] - ETA: 29s - loss: 1.0484 - regression_loss: 0.9280 - classification_loss: 0.1204 416/500 [=======================>......] - ETA: 28s - loss: 1.0490 - regression_loss: 0.9285 - classification_loss: 0.1205 417/500 [========================>.....] - ETA: 28s - loss: 1.0501 - regression_loss: 0.9294 - classification_loss: 0.1206 418/500 [========================>.....] - ETA: 27s - loss: 1.0499 - regression_loss: 0.9293 - classification_loss: 0.1206 419/500 [========================>.....] - ETA: 27s - loss: 1.0502 - regression_loss: 0.9293 - classification_loss: 0.1209 420/500 [========================>.....] - ETA: 27s - loss: 1.0496 - regression_loss: 0.9288 - classification_loss: 0.1208 421/500 [========================>.....] - ETA: 26s - loss: 1.0499 - regression_loss: 0.9291 - classification_loss: 0.1208 422/500 [========================>.....] - ETA: 26s - loss: 1.0500 - regression_loss: 0.9291 - classification_loss: 0.1209 423/500 [========================>.....] - ETA: 26s - loss: 1.0501 - regression_loss: 0.9291 - classification_loss: 0.1210 424/500 [========================>.....] - ETA: 25s - loss: 1.0505 - regression_loss: 0.9295 - classification_loss: 0.1210 425/500 [========================>.....] 
- ETA: 25s - loss: 1.0518 - regression_loss: 0.9306 - classification_loss: 0.1212 426/500 [========================>.....] - ETA: 25s - loss: 1.0530 - regression_loss: 0.9317 - classification_loss: 0.1213 427/500 [========================>.....] - ETA: 24s - loss: 1.0536 - regression_loss: 0.9321 - classification_loss: 0.1215 428/500 [========================>.....] - ETA: 24s - loss: 1.0535 - regression_loss: 0.9321 - classification_loss: 0.1215 429/500 [========================>.....] - ETA: 24s - loss: 1.0517 - regression_loss: 0.9304 - classification_loss: 0.1212 430/500 [========================>.....] - ETA: 23s - loss: 1.0527 - regression_loss: 0.9313 - classification_loss: 0.1214 431/500 [========================>.....] - ETA: 23s - loss: 1.0526 - regression_loss: 0.9312 - classification_loss: 0.1214 432/500 [========================>.....] - ETA: 23s - loss: 1.0538 - regression_loss: 0.9322 - classification_loss: 0.1216 433/500 [========================>.....] - ETA: 22s - loss: 1.0532 - regression_loss: 0.9316 - classification_loss: 0.1217 434/500 [=========================>....] - ETA: 22s - loss: 1.0540 - regression_loss: 0.9321 - classification_loss: 0.1219 435/500 [=========================>....] - ETA: 22s - loss: 1.0525 - regression_loss: 0.9307 - classification_loss: 0.1217 436/500 [=========================>....] - ETA: 21s - loss: 1.0535 - regression_loss: 0.9314 - classification_loss: 0.1221 437/500 [=========================>....] - ETA: 21s - loss: 1.0535 - regression_loss: 0.9313 - classification_loss: 0.1222 438/500 [=========================>....] - ETA: 21s - loss: 1.0534 - regression_loss: 0.9313 - classification_loss: 0.1220 439/500 [=========================>....] - ETA: 20s - loss: 1.0546 - regression_loss: 0.9325 - classification_loss: 0.1222 440/500 [=========================>....] - ETA: 20s - loss: 1.0552 - regression_loss: 0.9330 - classification_loss: 0.1222 441/500 [=========================>....] 
- ETA: 20s - loss: 1.0541 - regression_loss: 0.9320 - classification_loss: 0.1221 442/500 [=========================>....] - ETA: 19s - loss: 1.0532 - regression_loss: 0.9313 - classification_loss: 0.1220 443/500 [=========================>....] - ETA: 19s - loss: 1.0541 - regression_loss: 0.9319 - classification_loss: 0.1222 444/500 [=========================>....] - ETA: 19s - loss: 1.0528 - regression_loss: 0.9308 - classification_loss: 0.1220 445/500 [=========================>....] - ETA: 18s - loss: 1.0536 - regression_loss: 0.9315 - classification_loss: 0.1221 446/500 [=========================>....] - ETA: 18s - loss: 1.0536 - regression_loss: 0.9315 - classification_loss: 0.1221 447/500 [=========================>....] - ETA: 18s - loss: 1.0530 - regression_loss: 0.9310 - classification_loss: 0.1220 448/500 [=========================>....] - ETA: 17s - loss: 1.0529 - regression_loss: 0.9308 - classification_loss: 0.1221 449/500 [=========================>....] - ETA: 17s - loss: 1.0540 - regression_loss: 0.9317 - classification_loss: 0.1223 450/500 [==========================>...] - ETA: 17s - loss: 1.0548 - regression_loss: 0.9323 - classification_loss: 0.1224 451/500 [==========================>...] - ETA: 16s - loss: 1.0553 - regression_loss: 0.9328 - classification_loss: 0.1225 452/500 [==========================>...] - ETA: 16s - loss: 1.0566 - regression_loss: 0.9339 - classification_loss: 0.1227 453/500 [==========================>...] - ETA: 16s - loss: 1.0558 - regression_loss: 0.9332 - classification_loss: 0.1225 454/500 [==========================>...] - ETA: 15s - loss: 1.0555 - regression_loss: 0.9329 - classification_loss: 0.1226 455/500 [==========================>...] - ETA: 15s - loss: 1.0546 - regression_loss: 0.9323 - classification_loss: 0.1224 456/500 [==========================>...] - ETA: 15s - loss: 1.0540 - regression_loss: 0.9318 - classification_loss: 0.1222 457/500 [==========================>...] 
- ETA: 14s - loss: 1.0548 - regression_loss: 0.9325 - classification_loss: 0.1223 458/500 [==========================>...] - ETA: 14s - loss: 1.0548 - regression_loss: 0.9325 - classification_loss: 0.1223 459/500 [==========================>...] - ETA: 13s - loss: 1.0549 - regression_loss: 0.9326 - classification_loss: 0.1223 460/500 [==========================>...] - ETA: 13s - loss: 1.0562 - regression_loss: 0.9336 - classification_loss: 0.1226 461/500 [==========================>...] - ETA: 13s - loss: 1.0573 - regression_loss: 0.9346 - classification_loss: 0.1227 462/500 [==========================>...] - ETA: 12s - loss: 1.0559 - regression_loss: 0.9334 - classification_loss: 0.1225 463/500 [==========================>...] - ETA: 12s - loss: 1.0551 - regression_loss: 0.9327 - classification_loss: 0.1223 464/500 [==========================>...] - ETA: 12s - loss: 1.0555 - regression_loss: 0.9332 - classification_loss: 0.1223 465/500 [==========================>...] - ETA: 11s - loss: 1.0558 - regression_loss: 0.9334 - classification_loss: 0.1224 466/500 [==========================>...] - ETA: 11s - loss: 1.0560 - regression_loss: 0.9336 - classification_loss: 0.1224 467/500 [===========================>..] - ETA: 11s - loss: 1.0584 - regression_loss: 0.9357 - classification_loss: 0.1227 468/500 [===========================>..] - ETA: 10s - loss: 1.0577 - regression_loss: 0.9351 - classification_loss: 0.1225 469/500 [===========================>..] - ETA: 10s - loss: 1.0586 - regression_loss: 0.9358 - classification_loss: 0.1228 470/500 [===========================>..] - ETA: 10s - loss: 1.0582 - regression_loss: 0.9354 - classification_loss: 0.1227 471/500 [===========================>..] - ETA: 9s - loss: 1.0580 - regression_loss: 0.9354 - classification_loss: 0.1227  472/500 [===========================>..] - ETA: 9s - loss: 1.0584 - regression_loss: 0.9357 - classification_loss: 0.1228 473/500 [===========================>..] 
- ETA: 9s - loss: 1.0569 - regression_loss: 0.9343 - classification_loss: 0.1226 474/500 [===========================>..] - ETA: 8s - loss: 1.0559 - regression_loss: 0.9335 - classification_loss: 0.1224 475/500 [===========================>..] - ETA: 8s - loss: 1.0557 - regression_loss: 0.9334 - classification_loss: 0.1223 476/500 [===========================>..] - ETA: 8s - loss: 1.0563 - regression_loss: 0.9338 - classification_loss: 0.1225 477/500 [===========================>..] - ETA: 7s - loss: 1.0562 - regression_loss: 0.9338 - classification_loss: 0.1224 478/500 [===========================>..] - ETA: 7s - loss: 1.0555 - regression_loss: 0.9332 - classification_loss: 0.1223 479/500 [===========================>..] - ETA: 7s - loss: 1.0570 - regression_loss: 0.9334 - classification_loss: 0.1237 480/500 [===========================>..] - ETA: 6s - loss: 1.0577 - regression_loss: 0.9339 - classification_loss: 0.1238 481/500 [===========================>..] - ETA: 6s - loss: 1.0571 - regression_loss: 0.9335 - classification_loss: 0.1237 482/500 [===========================>..] - ETA: 6s - loss: 1.0574 - regression_loss: 0.9337 - classification_loss: 0.1238 483/500 [===========================>..] - ETA: 5s - loss: 1.0571 - regression_loss: 0.9334 - classification_loss: 0.1237 484/500 [============================>.] - ETA: 5s - loss: 1.0573 - regression_loss: 0.9337 - classification_loss: 0.1237 485/500 [============================>.] - ETA: 5s - loss: 1.0564 - regression_loss: 0.9329 - classification_loss: 0.1235 486/500 [============================>.] - ETA: 4s - loss: 1.0573 - regression_loss: 0.9336 - classification_loss: 0.1236 487/500 [============================>.] - ETA: 4s - loss: 1.0587 - regression_loss: 0.9349 - classification_loss: 0.1238 488/500 [============================>.] - ETA: 4s - loss: 1.0592 - regression_loss: 0.9355 - classification_loss: 0.1237 489/500 [============================>.] 
- ETA: 3s - loss: 1.0592 - regression_loss: 0.9355 - classification_loss: 0.1237 490/500 [============================>.] - ETA: 3s - loss: 1.0585 - regression_loss: 0.9349 - classification_loss: 0.1236 491/500 [============================>.] - ETA: 3s - loss: 1.0584 - regression_loss: 0.9347 - classification_loss: 0.1236 492/500 [============================>.] - ETA: 2s - loss: 1.0585 - regression_loss: 0.9349 - classification_loss: 0.1236 493/500 [============================>.] - ETA: 2s - loss: 1.0587 - regression_loss: 0.9350 - classification_loss: 0.1237 494/500 [============================>.] - ETA: 2s - loss: 1.0593 - regression_loss: 0.9356 - classification_loss: 0.1237 495/500 [============================>.] - ETA: 1s - loss: 1.0602 - regression_loss: 0.9364 - classification_loss: 0.1238 496/500 [============================>.] - ETA: 1s - loss: 1.0612 - regression_loss: 0.9372 - classification_loss: 0.1239 497/500 [============================>.] - ETA: 1s - loss: 1.0603 - regression_loss: 0.9363 - classification_loss: 0.1240 498/500 [============================>.] - ETA: 0s - loss: 1.0609 - regression_loss: 0.9368 - classification_loss: 0.1241 499/500 [============================>.] - ETA: 0s - loss: 1.0604 - regression_loss: 0.9365 - classification_loss: 0.1240
500/500 [==============================] - 171s 341ms/step - loss: 1.0595 - regression_loss: 0.9357 - classification_loss: 0.1238
1172 instances of class plum with average precision: 0.7738
mAP: 0.7738
Epoch 00026: saving model to ./training/snapshots/resnet101_pascal_26.h5
Epoch 27/150
1/500 [..............................] - ETA: 2:43 - loss: 1.1048 - regression_loss: 1.0080 - classification_loss: 0.0968 2/500 [..............................] - ETA: 2:50 - loss: 1.1965 - regression_loss: 1.0724 - classification_loss: 0.1241 3/500 [..............................] - ETA: 2:47 - loss: 1.0904 - regression_loss: 0.9808 - classification_loss: 0.1096 4/500 [..............................]
- ETA: 2:47 - loss: 1.0885 - regression_loss: 0.9770 - classification_loss: 0.1115 5/500 [..............................] - ETA: 2:48 - loss: 1.1222 - regression_loss: 1.0028 - classification_loss: 0.1194 6/500 [..............................] - ETA: 2:47 - loss: 1.0086 - regression_loss: 0.9052 - classification_loss: 0.1033 7/500 [..............................] - ETA: 2:46 - loss: 1.1106 - regression_loss: 0.9753 - classification_loss: 0.1354 8/500 [..............................] - ETA: 2:46 - loss: 1.0652 - regression_loss: 0.9369 - classification_loss: 0.1283 9/500 [..............................] - ETA: 2:48 - loss: 1.1051 - regression_loss: 0.9692 - classification_loss: 0.1359 10/500 [..............................] - ETA: 2:48 - loss: 1.0778 - regression_loss: 0.9458 - classification_loss: 0.1320 11/500 [..............................] - ETA: 2:47 - loss: 1.0624 - regression_loss: 0.9341 - classification_loss: 0.1283 12/500 [..............................] - ETA: 2:47 - loss: 1.0179 - regression_loss: 0.8946 - classification_loss: 0.1233 13/500 [..............................] - ETA: 2:47 - loss: 1.0242 - regression_loss: 0.9008 - classification_loss: 0.1234 14/500 [..............................] - ETA: 2:46 - loss: 1.0255 - regression_loss: 0.9006 - classification_loss: 0.1249 15/500 [..............................] - ETA: 2:45 - loss: 1.0445 - regression_loss: 0.9204 - classification_loss: 0.1242 16/500 [..............................] - ETA: 2:44 - loss: 1.0600 - regression_loss: 0.9335 - classification_loss: 0.1265 17/500 [>.............................] - ETA: 2:44 - loss: 1.0796 - regression_loss: 0.9483 - classification_loss: 0.1313 18/500 [>.............................] - ETA: 2:44 - loss: 1.1001 - regression_loss: 0.9669 - classification_loss: 0.1332 19/500 [>.............................] - ETA: 2:43 - loss: 1.0918 - regression_loss: 0.9611 - classification_loss: 0.1307 20/500 [>.............................] 
- ETA: 2:43 - loss: 1.0921 - regression_loss: 0.9586 - classification_loss: 0.1336 21/500 [>.............................] - ETA: 2:43 - loss: 1.1130 - regression_loss: 0.9759 - classification_loss: 0.1372 22/500 [>.............................] - ETA: 2:43 - loss: 1.1284 - regression_loss: 0.9882 - classification_loss: 0.1402 23/500 [>.............................] - ETA: 2:43 - loss: 1.1233 - regression_loss: 0.9857 - classification_loss: 0.1376 24/500 [>.............................] - ETA: 2:42 - loss: 1.1171 - regression_loss: 0.9811 - classification_loss: 0.1361 25/500 [>.............................] - ETA: 2:41 - loss: 1.1144 - regression_loss: 0.9790 - classification_loss: 0.1354 26/500 [>.............................] - ETA: 2:40 - loss: 1.1080 - regression_loss: 0.9749 - classification_loss: 0.1332 27/500 [>.............................] - ETA: 2:39 - loss: 1.0906 - regression_loss: 0.9607 - classification_loss: 0.1300 28/500 [>.............................] - ETA: 2:39 - loss: 1.0951 - regression_loss: 0.9629 - classification_loss: 0.1322 29/500 [>.............................] - ETA: 2:39 - loss: 1.1075 - regression_loss: 0.9750 - classification_loss: 0.1325 30/500 [>.............................] - ETA: 2:38 - loss: 1.1067 - regression_loss: 0.9749 - classification_loss: 0.1319 31/500 [>.............................] - ETA: 2:38 - loss: 1.0829 - regression_loss: 0.9539 - classification_loss: 0.1289 32/500 [>.............................] - ETA: 2:37 - loss: 1.0717 - regression_loss: 0.9447 - classification_loss: 0.1270 33/500 [>.............................] - ETA: 2:37 - loss: 1.0751 - regression_loss: 0.9465 - classification_loss: 0.1285 34/500 [=>............................] - ETA: 2:37 - loss: 1.0633 - regression_loss: 0.9371 - classification_loss: 0.1262 35/500 [=>............................] - ETA: 2:37 - loss: 1.0651 - regression_loss: 0.9385 - classification_loss: 0.1265 36/500 [=>............................] 
- ETA: 2:36 - loss: 1.0754 - regression_loss: 0.9470 - classification_loss: 0.1284 37/500 [=>............................] - ETA: 2:36 - loss: 1.0788 - regression_loss: 0.9499 - classification_loss: 0.1289 38/500 [=>............................] - ETA: 2:36 - loss: 1.0782 - regression_loss: 0.9502 - classification_loss: 0.1280 39/500 [=>............................] - ETA: 2:35 - loss: 1.0761 - regression_loss: 0.9481 - classification_loss: 0.1280 40/500 [=>............................] - ETA: 2:35 - loss: 1.0712 - regression_loss: 0.9452 - classification_loss: 0.1260 41/500 [=>............................] - ETA: 2:34 - loss: 1.0627 - regression_loss: 0.9383 - classification_loss: 0.1243 42/500 [=>............................] - ETA: 2:34 - loss: 1.0520 - regression_loss: 0.9291 - classification_loss: 0.1228 43/500 [=>............................] - ETA: 2:34 - loss: 1.0545 - regression_loss: 0.9322 - classification_loss: 0.1224 44/500 [=>............................] - ETA: 2:33 - loss: 1.0431 - regression_loss: 0.9224 - classification_loss: 0.1207 45/500 [=>............................] - ETA: 2:33 - loss: 1.0578 - regression_loss: 0.9318 - classification_loss: 0.1260 46/500 [=>............................] - ETA: 2:32 - loss: 1.0557 - regression_loss: 0.9304 - classification_loss: 0.1252 47/500 [=>............................] - ETA: 2:32 - loss: 1.0462 - regression_loss: 0.9227 - classification_loss: 0.1235 48/500 [=>............................] - ETA: 2:32 - loss: 1.0377 - regression_loss: 0.9160 - classification_loss: 0.1217 49/500 [=>............................] - ETA: 2:32 - loss: 1.0428 - regression_loss: 0.9202 - classification_loss: 0.1226 50/500 [==>...........................] - ETA: 2:31 - loss: 1.0479 - regression_loss: 0.9246 - classification_loss: 0.1233 51/500 [==>...........................] - ETA: 2:31 - loss: 1.0377 - regression_loss: 0.9160 - classification_loss: 0.1217 52/500 [==>...........................] 
- ETA: 2:31 - loss: 1.0389 - regression_loss: 0.9173 - classification_loss: 0.1216 53/500 [==>...........................] - ETA: 2:31 - loss: 1.0458 - regression_loss: 0.9238 - classification_loss: 0.1221 54/500 [==>...........................] - ETA: 2:30 - loss: 1.0325 - regression_loss: 0.9119 - classification_loss: 0.1205 55/500 [==>...........................] - ETA: 2:30 - loss: 1.0327 - regression_loss: 0.9112 - classification_loss: 0.1216 56/500 [==>...........................] - ETA: 2:29 - loss: 1.0247 - regression_loss: 0.9044 - classification_loss: 0.1203 57/500 [==>...........................] - ETA: 2:29 - loss: 1.0150 - regression_loss: 0.8953 - classification_loss: 0.1197 58/500 [==>...........................] - ETA: 2:29 - loss: 1.0116 - regression_loss: 0.8927 - classification_loss: 0.1190 59/500 [==>...........................] - ETA: 2:28 - loss: 1.0071 - regression_loss: 0.8887 - classification_loss: 0.1184 60/500 [==>...........................] - ETA: 2:28 - loss: 0.9953 - regression_loss: 0.8782 - classification_loss: 0.1171 61/500 [==>...........................] - ETA: 2:28 - loss: 0.9925 - regression_loss: 0.8764 - classification_loss: 0.1161 62/500 [==>...........................] - ETA: 2:28 - loss: 0.9974 - regression_loss: 0.8794 - classification_loss: 0.1181 63/500 [==>...........................] - ETA: 2:27 - loss: 0.9961 - regression_loss: 0.8780 - classification_loss: 0.1181 64/500 [==>...........................] - ETA: 2:27 - loss: 1.0064 - regression_loss: 0.8868 - classification_loss: 0.1196 65/500 [==>...........................] - ETA: 2:27 - loss: 0.9992 - regression_loss: 0.8804 - classification_loss: 0.1187 66/500 [==>...........................] - ETA: 2:26 - loss: 1.0011 - regression_loss: 0.8816 - classification_loss: 0.1195 67/500 [===>..........................] - ETA: 2:26 - loss: 1.0072 - regression_loss: 0.8870 - classification_loss: 0.1202 68/500 [===>..........................] 
- ETA: 2:25 - loss: 1.0130 - regression_loss: 0.8918 - classification_loss: 0.1212 69/500 [===>..........................] - ETA: 2:25 - loss: 1.0109 - regression_loss: 0.8900 - classification_loss: 0.1210 70/500 [===>..........................] - ETA: 2:25 - loss: 1.0111 - regression_loss: 0.8901 - classification_loss: 0.1211 71/500 [===>..........................] - ETA: 2:25 - loss: 1.0119 - regression_loss: 0.8909 - classification_loss: 0.1210 72/500 [===>..........................] - ETA: 2:24 - loss: 1.0105 - regression_loss: 0.8896 - classification_loss: 0.1210 73/500 [===>..........................] - ETA: 2:24 - loss: 1.0159 - regression_loss: 0.8940 - classification_loss: 0.1219 74/500 [===>..........................] - ETA: 2:24 - loss: 1.0180 - regression_loss: 0.8967 - classification_loss: 0.1213 75/500 [===>..........................] - ETA: 2:24 - loss: 1.0152 - regression_loss: 0.8950 - classification_loss: 0.1202 76/500 [===>..........................] - ETA: 2:23 - loss: 1.0156 - regression_loss: 0.8955 - classification_loss: 0.1201 77/500 [===>..........................] - ETA: 2:23 - loss: 1.0050 - regression_loss: 0.8862 - classification_loss: 0.1188 78/500 [===>..........................] - ETA: 2:22 - loss: 1.0025 - regression_loss: 0.8846 - classification_loss: 0.1178 79/500 [===>..........................] - ETA: 2:22 - loss: 1.0104 - regression_loss: 0.8918 - classification_loss: 0.1186 80/500 [===>..........................] - ETA: 2:22 - loss: 1.0107 - regression_loss: 0.8925 - classification_loss: 0.1182 81/500 [===>..........................] - ETA: 2:22 - loss: 1.0127 - regression_loss: 0.8939 - classification_loss: 0.1188 82/500 [===>..........................] - ETA: 2:21 - loss: 1.0108 - regression_loss: 0.8920 - classification_loss: 0.1188 83/500 [===>..........................] - ETA: 2:21 - loss: 1.0090 - regression_loss: 0.8911 - classification_loss: 0.1178 84/500 [====>.........................] 
- ETA: 2:21 - loss: 1.0174 - regression_loss: 0.8979 - classification_loss: 0.1195 85/500 [====>.........................] - ETA: 2:20 - loss: 1.0144 - regression_loss: 0.8957 - classification_loss: 0.1187 86/500 [====>.........................] - ETA: 2:20 - loss: 1.0076 - regression_loss: 0.8894 - classification_loss: 0.1183 87/500 [====>.........................] - ETA: 2:20 - loss: 1.0069 - regression_loss: 0.8889 - classification_loss: 0.1179 88/500 [====>.........................] - ETA: 2:19 - loss: 1.0066 - regression_loss: 0.8890 - classification_loss: 0.1177 89/500 [====>.........................] - ETA: 2:19 - loss: 1.0075 - regression_loss: 0.8894 - classification_loss: 0.1181 90/500 [====>.........................] - ETA: 2:18 - loss: 1.0116 - regression_loss: 0.8924 - classification_loss: 0.1192 91/500 [====>.........................] - ETA: 2:18 - loss: 1.0140 - regression_loss: 0.8950 - classification_loss: 0.1190 92/500 [====>.........................] - ETA: 2:18 - loss: 1.0126 - regression_loss: 0.8943 - classification_loss: 0.1183 93/500 [====>.........................] - ETA: 2:17 - loss: 1.0125 - regression_loss: 0.8945 - classification_loss: 0.1180 94/500 [====>.........................] - ETA: 2:17 - loss: 1.0137 - regression_loss: 0.8955 - classification_loss: 0.1181 95/500 [====>.........................] - ETA: 2:17 - loss: 1.0121 - regression_loss: 0.8941 - classification_loss: 0.1179 96/500 [====>.........................] - ETA: 2:16 - loss: 1.0135 - regression_loss: 0.8953 - classification_loss: 0.1182 97/500 [====>.........................] - ETA: 2:16 - loss: 1.0093 - regression_loss: 0.8912 - classification_loss: 0.1181 98/500 [====>.........................] - ETA: 2:16 - loss: 1.0172 - regression_loss: 0.8994 - classification_loss: 0.1178 99/500 [====>.........................] - ETA: 2:15 - loss: 1.0156 - regression_loss: 0.8982 - classification_loss: 0.1174 100/500 [=====>........................] 
- ETA: 2:15 - loss: 1.0180 - regression_loss: 0.9007 - classification_loss: 0.1173 101/500 [=====>........................] - ETA: 2:15 - loss: 1.0178 - regression_loss: 0.9006 - classification_loss: 0.1171 102/500 [=====>........................] - ETA: 2:15 - loss: 1.0157 - regression_loss: 0.8990 - classification_loss: 0.1167 103/500 [=====>........................] - ETA: 2:14 - loss: 1.0232 - regression_loss: 0.9051 - classification_loss: 0.1181 104/500 [=====>........................] - ETA: 2:14 - loss: 1.0291 - regression_loss: 0.9100 - classification_loss: 0.1191 105/500 [=====>........................] - ETA: 2:14 - loss: 1.0306 - regression_loss: 0.9118 - classification_loss: 0.1188 106/500 [=====>........................] - ETA: 2:13 - loss: 1.0341 - regression_loss: 0.9150 - classification_loss: 0.1191 107/500 [=====>........................] - ETA: 2:13 - loss: 1.0350 - regression_loss: 0.9161 - classification_loss: 0.1189 108/500 [=====>........................] - ETA: 2:13 - loss: 1.0358 - regression_loss: 0.9172 - classification_loss: 0.1186 109/500 [=====>........................] - ETA: 2:12 - loss: 1.0392 - regression_loss: 0.9200 - classification_loss: 0.1192 110/500 [=====>........................] - ETA: 2:12 - loss: 1.0399 - regression_loss: 0.9211 - classification_loss: 0.1187 111/500 [=====>........................] - ETA: 2:12 - loss: 1.0428 - regression_loss: 0.9239 - classification_loss: 0.1189 112/500 [=====>........................] - ETA: 2:11 - loss: 1.0383 - regression_loss: 0.9201 - classification_loss: 0.1181 113/500 [=====>........................] - ETA: 2:11 - loss: 1.0420 - regression_loss: 0.9231 - classification_loss: 0.1190 114/500 [=====>........................] - ETA: 2:11 - loss: 1.0437 - regression_loss: 0.9245 - classification_loss: 0.1192 115/500 [=====>........................] - ETA: 2:10 - loss: 1.0447 - regression_loss: 0.9253 - classification_loss: 0.1194 116/500 [=====>........................] 
- ETA: 2:10 - loss: 1.0439 - regression_loss: 0.9250 - classification_loss: 0.1189 117/500 [======>.......................] - ETA: 2:10 - loss: 1.0455 - regression_loss: 0.9265 - classification_loss: 0.1191 118/500 [======>.......................] - ETA: 2:09 - loss: 1.0427 - regression_loss: 0.9244 - classification_loss: 0.1183 119/500 [======>.......................] - ETA: 2:09 - loss: 1.0481 - regression_loss: 0.9220 - classification_loss: 0.1261 120/500 [======>.......................] - ETA: 2:09 - loss: 1.0419 - regression_loss: 0.9165 - classification_loss: 0.1254 121/500 [======>.......................] - ETA: 2:08 - loss: 1.0420 - regression_loss: 0.9164 - classification_loss: 0.1256 122/500 [======>.......................] - ETA: 2:08 - loss: 1.0431 - regression_loss: 0.9175 - classification_loss: 0.1256 123/500 [======>.......................] - ETA: 2:08 - loss: 1.0428 - regression_loss: 0.9173 - classification_loss: 0.1255 124/500 [======>.......................] - ETA: 2:07 - loss: 1.0460 - regression_loss: 0.9198 - classification_loss: 0.1262 125/500 [======>.......................] - ETA: 2:07 - loss: 1.0460 - regression_loss: 0.9203 - classification_loss: 0.1257 126/500 [======>.......................] - ETA: 2:06 - loss: 1.0455 - regression_loss: 0.9201 - classification_loss: 0.1254 127/500 [======>.......................] - ETA: 2:06 - loss: 1.0458 - regression_loss: 0.9201 - classification_loss: 0.1258 128/500 [======>.......................] - ETA: 2:06 - loss: 1.0447 - regression_loss: 0.9194 - classification_loss: 0.1254 129/500 [======>.......................] - ETA: 2:06 - loss: 1.0410 - regression_loss: 0.9159 - classification_loss: 0.1251 130/500 [======>.......................] - ETA: 2:05 - loss: 1.0435 - regression_loss: 0.9181 - classification_loss: 0.1255 131/500 [======>.......................] - ETA: 2:05 - loss: 1.0405 - regression_loss: 0.9156 - classification_loss: 0.1249 132/500 [======>.......................] 
- ETA: 2:05 - loss: 1.0428 - regression_loss: 0.9176 - classification_loss: 0.1251 133/500 [======>.......................] - ETA: 2:04 - loss: 1.0398 - regression_loss: 0.9152 - classification_loss: 0.1246 134/500 [=======>......................] - ETA: 2:04 - loss: 1.0382 - regression_loss: 0.9139 - classification_loss: 0.1242 135/500 [=======>......................] - ETA: 2:04 - loss: 1.0407 - regression_loss: 0.9165 - classification_loss: 0.1243 136/500 [=======>......................] - ETA: 2:03 - loss: 1.0377 - regression_loss: 0.9140 - classification_loss: 0.1236 137/500 [=======>......................] - ETA: 2:03 - loss: 1.0364 - regression_loss: 0.9131 - classification_loss: 0.1233 138/500 [=======>......................] - ETA: 2:03 - loss: 1.0351 - regression_loss: 0.9100 - classification_loss: 0.1251 139/500 [=======>......................] - ETA: 2:02 - loss: 1.0348 - regression_loss: 0.9101 - classification_loss: 0.1248 140/500 [=======>......................] - ETA: 2:02 - loss: 1.0328 - regression_loss: 0.9086 - classification_loss: 0.1242 141/500 [=======>......................] - ETA: 2:02 - loss: 1.0336 - regression_loss: 0.9089 - classification_loss: 0.1247 142/500 [=======>......................] - ETA: 2:01 - loss: 1.0350 - regression_loss: 0.9095 - classification_loss: 0.1255 143/500 [=======>......................] - ETA: 2:01 - loss: 1.0345 - regression_loss: 0.9091 - classification_loss: 0.1254 144/500 [=======>......................] - ETA: 2:01 - loss: 1.0336 - regression_loss: 0.9083 - classification_loss: 0.1254 145/500 [=======>......................] - ETA: 2:00 - loss: 1.0357 - regression_loss: 0.9100 - classification_loss: 0.1258 146/500 [=======>......................] - ETA: 2:00 - loss: 1.0340 - regression_loss: 0.9085 - classification_loss: 0.1254 147/500 [=======>......................] - ETA: 2:00 - loss: 1.0367 - regression_loss: 0.9110 - classification_loss: 0.1257 148/500 [=======>......................] 
- ETA: 1:59 - loss: 1.0362 - regression_loss: 0.9109 - classification_loss: 0.1252 149/500 [=======>......................] - ETA: 1:59 - loss: 1.0353 - regression_loss: 0.9100 - classification_loss: 0.1253 150/500 [========>.....................] - ETA: 1:59 - loss: 1.0322 - regression_loss: 0.9068 - classification_loss: 0.1254 151/500 [========>.....................] - ETA: 1:58 - loss: 1.0298 - regression_loss: 0.9049 - classification_loss: 0.1250 152/500 [========>.....................] - ETA: 1:58 - loss: 1.0323 - regression_loss: 0.9066 - classification_loss: 0.1257 153/500 [========>.....................] - ETA: 1:57 - loss: 1.0313 - regression_loss: 0.9058 - classification_loss: 0.1255 154/500 [========>.....................] - ETA: 1:57 - loss: 1.0276 - regression_loss: 0.9025 - classification_loss: 0.1250 155/500 [========>.....................] - ETA: 1:57 - loss: 1.0273 - regression_loss: 0.9023 - classification_loss: 0.1249 156/500 [========>.....................] - ETA: 1:56 - loss: 1.0276 - regression_loss: 0.9026 - classification_loss: 0.1250 157/500 [========>.....................] - ETA: 1:56 - loss: 1.0281 - regression_loss: 0.9032 - classification_loss: 0.1249 158/500 [========>.....................] - ETA: 1:56 - loss: 1.0298 - regression_loss: 0.9049 - classification_loss: 0.1249 159/500 [========>.....................] - ETA: 1:55 - loss: 1.0312 - regression_loss: 0.9061 - classification_loss: 0.1251 160/500 [========>.....................] - ETA: 1:55 - loss: 1.0309 - regression_loss: 0.9056 - classification_loss: 0.1253 161/500 [========>.....................] - ETA: 1:55 - loss: 1.0306 - regression_loss: 0.9055 - classification_loss: 0.1251 162/500 [========>.....................] - ETA: 1:54 - loss: 1.0354 - regression_loss: 0.9101 - classification_loss: 0.1252 163/500 [========>.....................] - ETA: 1:54 - loss: 1.0349 - regression_loss: 0.9097 - classification_loss: 0.1252 164/500 [========>.....................] 
- ETA: 1:54 - loss: 1.0351 - regression_loss: 0.9102 - classification_loss: 0.1249 165/500 [========>.....................] - ETA: 1:53 - loss: 1.0366 - regression_loss: 0.9114 - classification_loss: 0.1252 166/500 [========>.....................] - ETA: 1:53 - loss: 1.0365 - regression_loss: 0.9109 - classification_loss: 0.1256 167/500 [=========>....................] - ETA: 1:53 - loss: 1.0344 - regression_loss: 0.9093 - classification_loss: 0.1251 168/500 [=========>....................] - ETA: 1:52 - loss: 1.0329 - regression_loss: 0.9081 - classification_loss: 0.1248 169/500 [=========>....................] - ETA: 1:52 - loss: 1.0340 - regression_loss: 0.9088 - classification_loss: 0.1252 170/500 [=========>....................] - ETA: 1:52 - loss: 1.0321 - regression_loss: 0.9073 - classification_loss: 0.1247 171/500 [=========>....................] - ETA: 1:51 - loss: 1.0323 - regression_loss: 0.9077 - classification_loss: 0.1246 172/500 [=========>....................] - ETA: 1:51 - loss: 1.0331 - regression_loss: 0.9081 - classification_loss: 0.1249 173/500 [=========>....................] - ETA: 1:51 - loss: 1.0325 - regression_loss: 0.9080 - classification_loss: 0.1245 174/500 [=========>....................] - ETA: 1:50 - loss: 1.0327 - regression_loss: 0.9082 - classification_loss: 0.1244 175/500 [=========>....................] - ETA: 1:50 - loss: 1.0342 - regression_loss: 0.9094 - classification_loss: 0.1247 176/500 [=========>....................] - ETA: 1:50 - loss: 1.0331 - regression_loss: 0.9085 - classification_loss: 0.1246 177/500 [=========>....................] - ETA: 1:49 - loss: 1.0332 - regression_loss: 0.9087 - classification_loss: 0.1245 178/500 [=========>....................] - ETA: 1:49 - loss: 1.0338 - regression_loss: 0.9092 - classification_loss: 0.1246 179/500 [=========>....................] - ETA: 1:49 - loss: 1.0332 - regression_loss: 0.9088 - classification_loss: 0.1244 180/500 [=========>....................] 
- ETA: 1:48 - loss: 1.0357 - regression_loss: 0.9106 - classification_loss: 0.1251 181/500 [=========>....................] - ETA: 1:48 - loss: 1.0351 - regression_loss: 0.9103 - classification_loss: 0.1248 182/500 [=========>....................] - ETA: 1:48 - loss: 1.0345 - regression_loss: 0.9102 - classification_loss: 0.1243 183/500 [=========>....................] - ETA: 1:47 - loss: 1.0323 - regression_loss: 0.9085 - classification_loss: 0.1238 184/500 [==========>...................] - ETA: 1:47 - loss: 1.0325 - regression_loss: 0.9085 - classification_loss: 0.1240 185/500 [==========>...................] - ETA: 1:47 - loss: 1.0330 - regression_loss: 0.9091 - classification_loss: 0.1239 186/500 [==========>...................] - ETA: 1:46 - loss: 1.0328 - regression_loss: 0.9089 - classification_loss: 0.1240 187/500 [==========>...................] - ETA: 1:46 - loss: 1.0348 - regression_loss: 0.9106 - classification_loss: 0.1242 188/500 [==========>...................] - ETA: 1:46 - loss: 1.0335 - regression_loss: 0.9094 - classification_loss: 0.1241 189/500 [==========>...................] - ETA: 1:45 - loss: 1.0343 - regression_loss: 0.9099 - classification_loss: 0.1244 190/500 [==========>...................] - ETA: 1:45 - loss: 1.0362 - regression_loss: 0.9116 - classification_loss: 0.1246 191/500 [==========>...................] - ETA: 1:45 - loss: 1.0353 - regression_loss: 0.9108 - classification_loss: 0.1244 192/500 [==========>...................] - ETA: 1:44 - loss: 1.0355 - regression_loss: 0.9111 - classification_loss: 0.1244 193/500 [==========>...................] - ETA: 1:44 - loss: 1.0388 - regression_loss: 0.9138 - classification_loss: 0.1250 194/500 [==========>...................] - ETA: 1:44 - loss: 1.0400 - regression_loss: 0.9149 - classification_loss: 0.1251 195/500 [==========>...................] - ETA: 1:43 - loss: 1.0424 - regression_loss: 0.9172 - classification_loss: 0.1252 196/500 [==========>...................] 
- ETA: 1:43 - loss: 1.0440 - regression_loss: 0.9183 - classification_loss: 0.1257 197/500 [==========>...................] - ETA: 1:43 - loss: 1.0457 - regression_loss: 0.9197 - classification_loss: 0.1260 198/500 [==========>...................] - ETA: 1:42 - loss: 1.0471 - regression_loss: 0.9207 - classification_loss: 0.1264 199/500 [==========>...................] - ETA: 1:42 - loss: 1.0442 - regression_loss: 0.9181 - classification_loss: 0.1261 200/500 [===========>..................] - ETA: 1:42 - loss: 1.0410 - regression_loss: 0.9152 - classification_loss: 0.1257 201/500 [===========>..................] - ETA: 1:41 - loss: 1.0421 - regression_loss: 0.9162 - classification_loss: 0.1259 202/500 [===========>..................] - ETA: 1:41 - loss: 1.0428 - regression_loss: 0.9168 - classification_loss: 0.1260 203/500 [===========>..................] - ETA: 1:40 - loss: 1.0429 - regression_loss: 0.9169 - classification_loss: 0.1260 204/500 [===========>..................] - ETA: 1:40 - loss: 1.0423 - regression_loss: 0.9162 - classification_loss: 0.1261 205/500 [===========>..................] - ETA: 1:40 - loss: 1.0430 - regression_loss: 0.9169 - classification_loss: 0.1261 206/500 [===========>..................] - ETA: 1:39 - loss: 1.0422 - regression_loss: 0.9162 - classification_loss: 0.1260 207/500 [===========>..................] - ETA: 1:39 - loss: 1.0450 - regression_loss: 0.9186 - classification_loss: 0.1264 208/500 [===========>..................] - ETA: 1:39 - loss: 1.0444 - regression_loss: 0.9184 - classification_loss: 0.1261 209/500 [===========>..................] - ETA: 1:38 - loss: 1.0452 - regression_loss: 0.9189 - classification_loss: 0.1263 210/500 [===========>..................] - ETA: 1:38 - loss: 1.0423 - regression_loss: 0.9164 - classification_loss: 0.1259 211/500 [===========>..................] - ETA: 1:38 - loss: 1.0436 - regression_loss: 0.9173 - classification_loss: 0.1262 212/500 [===========>..................] 
- ETA: 1:37 - loss: 1.0432 - regression_loss: 0.9170 - classification_loss: 0.1261 213/500 [===========>..................] - ETA: 1:37 - loss: 1.0457 - regression_loss: 0.9192 - classification_loss: 0.1265 214/500 [===========>..................] - ETA: 1:37 - loss: 1.0457 - regression_loss: 0.9192 - classification_loss: 0.1265 215/500 [===========>..................] - ETA: 1:36 - loss: 1.0429 - regression_loss: 0.9168 - classification_loss: 0.1261 216/500 [===========>..................] - ETA: 1:36 - loss: 1.0433 - regression_loss: 0.9174 - classification_loss: 0.1259 217/500 [============>.................] - ETA: 1:36 - loss: 1.0404 - regression_loss: 0.9149 - classification_loss: 0.1255 218/500 [============>.................] - ETA: 1:35 - loss: 1.0423 - regression_loss: 0.9165 - classification_loss: 0.1258 219/500 [============>.................] - ETA: 1:35 - loss: 1.0424 - regression_loss: 0.9166 - classification_loss: 0.1258 220/500 [============>.................] - ETA: 1:35 - loss: 1.0440 - regression_loss: 0.9177 - classification_loss: 0.1262 221/500 [============>.................] - ETA: 1:34 - loss: 1.0437 - regression_loss: 0.9176 - classification_loss: 0.1261 222/500 [============>.................] - ETA: 1:34 - loss: 1.0457 - regression_loss: 0.9195 - classification_loss: 0.1261 223/500 [============>.................] - ETA: 1:34 - loss: 1.0476 - regression_loss: 0.9216 - classification_loss: 0.1260 224/500 [============>.................] - ETA: 1:33 - loss: 1.0498 - regression_loss: 0.9234 - classification_loss: 0.1264 225/500 [============>.................] - ETA: 1:33 - loss: 1.0499 - regression_loss: 0.9238 - classification_loss: 0.1261 226/500 [============>.................] - ETA: 1:33 - loss: 1.0509 - regression_loss: 0.9247 - classification_loss: 0.1262 227/500 [============>.................] - ETA: 1:32 - loss: 1.0485 - regression_loss: 0.9226 - classification_loss: 0.1259 228/500 [============>.................] 
[per-step progress-bar frames for steps 229-499 of epoch 27 omitted; loss held near 1.05 (regression_loss ~0.93, classification_loss ~0.12) throughout]
500/500 [==============================] - 170s 340ms/step - loss: 1.0555 - regression_loss: 0.9346 - classification_loss: 0.1209
1172 instances of class plum with average precision: 0.8004
mAP: 0.8004
Epoch 00027: saving model to ./training/snapshots/resnet101_pascal_27.h5
Epoch 28/150
[per-step progress-bar frames for steps 1-61 of epoch 28 omitted; loss settled into the 0.92-0.96 range after the first few batches]
- ETA: 2:28 - loss: 0.9370 - regression_loss: 0.8354 - classification_loss: 0.1016 63/500 [==>...........................] - ETA: 2:28 - loss: 0.9435 - regression_loss: 0.8408 - classification_loss: 0.1028 64/500 [==>...........................] - ETA: 2:28 - loss: 0.9528 - regression_loss: 0.8473 - classification_loss: 0.1054 65/500 [==>...........................] - ETA: 2:27 - loss: 0.9505 - regression_loss: 0.8456 - classification_loss: 0.1049 66/500 [==>...........................] - ETA: 2:27 - loss: 0.9491 - regression_loss: 0.8449 - classification_loss: 0.1043 67/500 [===>..........................] - ETA: 2:27 - loss: 0.9500 - regression_loss: 0.8457 - classification_loss: 0.1043 68/500 [===>..........................] - ETA: 2:26 - loss: 0.9434 - regression_loss: 0.8400 - classification_loss: 0.1034 69/500 [===>..........................] - ETA: 2:26 - loss: 0.9496 - regression_loss: 0.8454 - classification_loss: 0.1042 70/500 [===>..........................] - ETA: 2:25 - loss: 0.9514 - regression_loss: 0.8473 - classification_loss: 0.1041 71/500 [===>..........................] - ETA: 2:25 - loss: 0.9473 - regression_loss: 0.8438 - classification_loss: 0.1035 72/500 [===>..........................] - ETA: 2:25 - loss: 0.9481 - regression_loss: 0.8443 - classification_loss: 0.1038 73/500 [===>..........................] - ETA: 2:25 - loss: 0.9540 - regression_loss: 0.8495 - classification_loss: 0.1045 74/500 [===>..........................] - ETA: 2:24 - loss: 0.9572 - regression_loss: 0.8521 - classification_loss: 0.1052 75/500 [===>..........................] - ETA: 2:24 - loss: 0.9564 - regression_loss: 0.8512 - classification_loss: 0.1052 76/500 [===>..........................] - ETA: 2:24 - loss: 0.9491 - regression_loss: 0.8448 - classification_loss: 0.1043 77/500 [===>..........................] - ETA: 2:23 - loss: 0.9543 - regression_loss: 0.8494 - classification_loss: 0.1049 78/500 [===>..........................] 
- ETA: 2:23 - loss: 0.9564 - regression_loss: 0.8497 - classification_loss: 0.1067 79/500 [===>..........................] - ETA: 2:23 - loss: 0.9617 - regression_loss: 0.8547 - classification_loss: 0.1070 80/500 [===>..........................] - ETA: 2:22 - loss: 0.9572 - regression_loss: 0.8504 - classification_loss: 0.1068 81/500 [===>..........................] - ETA: 2:22 - loss: 0.9530 - regression_loss: 0.8469 - classification_loss: 0.1061 82/500 [===>..........................] - ETA: 2:22 - loss: 0.9477 - regression_loss: 0.8425 - classification_loss: 0.1052 83/500 [===>..........................] - ETA: 2:21 - loss: 0.9519 - regression_loss: 0.8459 - classification_loss: 0.1059 84/500 [====>.........................] - ETA: 2:21 - loss: 0.9549 - regression_loss: 0.8484 - classification_loss: 0.1065 85/500 [====>.........................] - ETA: 2:20 - loss: 0.9533 - regression_loss: 0.8472 - classification_loss: 0.1061 86/500 [====>.........................] - ETA: 2:20 - loss: 0.9532 - regression_loss: 0.8477 - classification_loss: 0.1055 87/500 [====>.........................] - ETA: 2:20 - loss: 0.9501 - regression_loss: 0.8450 - classification_loss: 0.1051 88/500 [====>.........................] - ETA: 2:19 - loss: 0.9564 - regression_loss: 0.8510 - classification_loss: 0.1054 89/500 [====>.........................] - ETA: 2:19 - loss: 0.9544 - regression_loss: 0.8498 - classification_loss: 0.1047 90/500 [====>.........................] - ETA: 2:19 - loss: 0.9586 - regression_loss: 0.8533 - classification_loss: 0.1053 91/500 [====>.........................] - ETA: 2:18 - loss: 0.9662 - regression_loss: 0.8607 - classification_loss: 0.1055 92/500 [====>.........................] - ETA: 2:18 - loss: 0.9636 - regression_loss: 0.8583 - classification_loss: 0.1053 93/500 [====>.........................] - ETA: 2:18 - loss: 0.9651 - regression_loss: 0.8594 - classification_loss: 0.1057 94/500 [====>.........................] 
- ETA: 2:17 - loss: 0.9667 - regression_loss: 0.8608 - classification_loss: 0.1059 95/500 [====>.........................] - ETA: 2:17 - loss: 0.9604 - regression_loss: 0.8554 - classification_loss: 0.1051 96/500 [====>.........................] - ETA: 2:17 - loss: 0.9595 - regression_loss: 0.8545 - classification_loss: 0.1050 97/500 [====>.........................] - ETA: 2:16 - loss: 0.9622 - regression_loss: 0.8567 - classification_loss: 0.1055 98/500 [====>.........................] - ETA: 2:16 - loss: 0.9607 - regression_loss: 0.8560 - classification_loss: 0.1047 99/500 [====>.........................] - ETA: 2:16 - loss: 0.9663 - regression_loss: 0.8608 - classification_loss: 0.1055 100/500 [=====>........................] - ETA: 2:15 - loss: 0.9651 - regression_loss: 0.8597 - classification_loss: 0.1054 101/500 [=====>........................] - ETA: 2:15 - loss: 0.9657 - regression_loss: 0.8602 - classification_loss: 0.1055 102/500 [=====>........................] - ETA: 2:15 - loss: 0.9605 - regression_loss: 0.8557 - classification_loss: 0.1048 103/500 [=====>........................] - ETA: 2:14 - loss: 0.9917 - regression_loss: 0.8682 - classification_loss: 0.1235 104/500 [=====>........................] - ETA: 2:14 - loss: 0.9942 - regression_loss: 0.8706 - classification_loss: 0.1236 105/500 [=====>........................] - ETA: 2:14 - loss: 0.9899 - regression_loss: 0.8672 - classification_loss: 0.1228 106/500 [=====>........................] - ETA: 2:13 - loss: 0.9861 - regression_loss: 0.8642 - classification_loss: 0.1219 107/500 [=====>........................] - ETA: 2:13 - loss: 0.9889 - regression_loss: 0.8672 - classification_loss: 0.1217 108/500 [=====>........................] - ETA: 2:13 - loss: 0.9913 - regression_loss: 0.8694 - classification_loss: 0.1219 109/500 [=====>........................] - ETA: 2:12 - loss: 0.9940 - regression_loss: 0.8719 - classification_loss: 0.1221 110/500 [=====>........................] 
- ETA: 2:12 - loss: 0.9977 - regression_loss: 0.8760 - classification_loss: 0.1217 111/500 [=====>........................] - ETA: 2:12 - loss: 0.9975 - regression_loss: 0.8758 - classification_loss: 0.1216 112/500 [=====>........................] - ETA: 2:11 - loss: 0.9984 - regression_loss: 0.8772 - classification_loss: 0.1212 113/500 [=====>........................] - ETA: 2:11 - loss: 1.0036 - regression_loss: 0.8814 - classification_loss: 0.1222 114/500 [=====>........................] - ETA: 2:11 - loss: 1.0056 - regression_loss: 0.8832 - classification_loss: 0.1225 115/500 [=====>........................] - ETA: 2:10 - loss: 1.0087 - regression_loss: 0.8857 - classification_loss: 0.1229 116/500 [=====>........................] - ETA: 2:10 - loss: 1.0030 - regression_loss: 0.8809 - classification_loss: 0.1221 117/500 [======>.......................] - ETA: 2:10 - loss: 1.0064 - regression_loss: 0.8836 - classification_loss: 0.1228 118/500 [======>.......................] - ETA: 2:09 - loss: 1.0057 - regression_loss: 0.8833 - classification_loss: 0.1225 119/500 [======>.......................] - ETA: 2:09 - loss: 1.0067 - regression_loss: 0.8844 - classification_loss: 0.1223 120/500 [======>.......................] - ETA: 2:09 - loss: 1.0068 - regression_loss: 0.8843 - classification_loss: 0.1225 121/500 [======>.......................] - ETA: 2:08 - loss: 1.0099 - regression_loss: 0.8868 - classification_loss: 0.1231 122/500 [======>.......................] - ETA: 2:08 - loss: 1.0130 - regression_loss: 0.8895 - classification_loss: 0.1235 123/500 [======>.......................] - ETA: 2:07 - loss: 1.0169 - regression_loss: 0.8928 - classification_loss: 0.1241 124/500 [======>.......................] - ETA: 2:07 - loss: 1.0202 - regression_loss: 0.8962 - classification_loss: 0.1240 125/500 [======>.......................] - ETA: 2:07 - loss: 1.0212 - regression_loss: 0.8972 - classification_loss: 0.1240 126/500 [======>.......................] 
- ETA: 2:06 - loss: 1.0191 - regression_loss: 0.8949 - classification_loss: 0.1243 127/500 [======>.......................] - ETA: 2:06 - loss: 1.0211 - regression_loss: 0.8967 - classification_loss: 0.1244 128/500 [======>.......................] - ETA: 2:06 - loss: 1.0218 - regression_loss: 0.8976 - classification_loss: 0.1241 129/500 [======>.......................] - ETA: 2:05 - loss: 1.0247 - regression_loss: 0.8999 - classification_loss: 0.1248 130/500 [======>.......................] - ETA: 2:05 - loss: 1.0267 - regression_loss: 0.9017 - classification_loss: 0.1250 131/500 [======>.......................] - ETA: 2:05 - loss: 1.0294 - regression_loss: 0.9040 - classification_loss: 0.1254 132/500 [======>.......................] - ETA: 2:04 - loss: 1.0271 - regression_loss: 0.9020 - classification_loss: 0.1251 133/500 [======>.......................] - ETA: 2:04 - loss: 1.0258 - regression_loss: 0.9011 - classification_loss: 0.1247 134/500 [=======>......................] - ETA: 2:04 - loss: 1.0277 - regression_loss: 0.9027 - classification_loss: 0.1250 135/500 [=======>......................] - ETA: 2:03 - loss: 1.0295 - regression_loss: 0.9044 - classification_loss: 0.1251 136/500 [=======>......................] - ETA: 2:03 - loss: 1.0262 - regression_loss: 0.9017 - classification_loss: 0.1245 137/500 [=======>......................] - ETA: 2:03 - loss: 1.0278 - regression_loss: 0.9030 - classification_loss: 0.1248 138/500 [=======>......................] - ETA: 2:02 - loss: 1.0235 - regression_loss: 0.8993 - classification_loss: 0.1242 139/500 [=======>......................] - ETA: 2:02 - loss: 1.0214 - regression_loss: 0.8975 - classification_loss: 0.1239 140/500 [=======>......................] - ETA: 2:02 - loss: 1.0205 - regression_loss: 0.8971 - classification_loss: 0.1234 141/500 [=======>......................] - ETA: 2:01 - loss: 1.0233 - regression_loss: 0.8997 - classification_loss: 0.1237 142/500 [=======>......................] 
- ETA: 2:01 - loss: 1.0222 - regression_loss: 0.8986 - classification_loss: 0.1236 143/500 [=======>......................] - ETA: 2:01 - loss: 1.0196 - regression_loss: 0.8964 - classification_loss: 0.1231 144/500 [=======>......................] - ETA: 2:00 - loss: 1.0202 - regression_loss: 0.8970 - classification_loss: 0.1232 145/500 [=======>......................] - ETA: 2:00 - loss: 1.0178 - regression_loss: 0.8951 - classification_loss: 0.1226 146/500 [=======>......................] - ETA: 2:00 - loss: 1.0178 - regression_loss: 0.8953 - classification_loss: 0.1225 147/500 [=======>......................] - ETA: 1:59 - loss: 1.0169 - regression_loss: 0.8946 - classification_loss: 0.1222 148/500 [=======>......................] - ETA: 1:59 - loss: 1.0174 - regression_loss: 0.8956 - classification_loss: 0.1219 149/500 [=======>......................] - ETA: 1:59 - loss: 1.0180 - regression_loss: 0.8959 - classification_loss: 0.1221 150/500 [========>.....................] - ETA: 1:58 - loss: 1.0165 - regression_loss: 0.8948 - classification_loss: 0.1217 151/500 [========>.....................] - ETA: 1:58 - loss: 1.0158 - regression_loss: 0.8941 - classification_loss: 0.1217 152/500 [========>.....................] - ETA: 1:58 - loss: 1.0127 - regression_loss: 0.8916 - classification_loss: 0.1211 153/500 [========>.....................] - ETA: 1:57 - loss: 1.0141 - regression_loss: 0.8927 - classification_loss: 0.1214 154/500 [========>.....................] - ETA: 1:57 - loss: 1.0107 - regression_loss: 0.8896 - classification_loss: 0.1211 155/500 [========>.....................] - ETA: 1:57 - loss: 1.0128 - regression_loss: 0.8914 - classification_loss: 0.1214 156/500 [========>.....................] - ETA: 1:56 - loss: 1.0116 - regression_loss: 0.8906 - classification_loss: 0.1210 157/500 [========>.....................] - ETA: 1:56 - loss: 1.0129 - regression_loss: 0.8920 - classification_loss: 0.1209 158/500 [========>.....................] 
- ETA: 1:56 - loss: 1.0144 - regression_loss: 0.8933 - classification_loss: 0.1211 159/500 [========>.....................] - ETA: 1:55 - loss: 1.0137 - regression_loss: 0.8928 - classification_loss: 0.1209 160/500 [========>.....................] - ETA: 1:55 - loss: 1.0162 - regression_loss: 0.8950 - classification_loss: 0.1212 161/500 [========>.....................] - ETA: 1:55 - loss: 1.0139 - regression_loss: 0.8933 - classification_loss: 0.1205 162/500 [========>.....................] - ETA: 1:54 - loss: 1.0147 - regression_loss: 0.8939 - classification_loss: 0.1208 163/500 [========>.....................] - ETA: 1:54 - loss: 1.0109 - regression_loss: 0.8906 - classification_loss: 0.1203 164/500 [========>.....................] - ETA: 1:54 - loss: 1.0097 - regression_loss: 0.8898 - classification_loss: 0.1199 165/500 [========>.....................] - ETA: 1:53 - loss: 1.0109 - regression_loss: 0.8910 - classification_loss: 0.1198 166/500 [========>.....................] - ETA: 1:53 - loss: 1.0109 - regression_loss: 0.8913 - classification_loss: 0.1196 167/500 [=========>....................] - ETA: 1:53 - loss: 1.0131 - regression_loss: 0.8937 - classification_loss: 0.1194 168/500 [=========>....................] - ETA: 1:52 - loss: 1.0130 - regression_loss: 0.8938 - classification_loss: 0.1192 169/500 [=========>....................] - ETA: 1:52 - loss: 1.0131 - regression_loss: 0.8939 - classification_loss: 0.1192 170/500 [=========>....................] - ETA: 1:51 - loss: 1.0148 - regression_loss: 0.8953 - classification_loss: 0.1195 171/500 [=========>....................] - ETA: 1:51 - loss: 1.0165 - regression_loss: 0.8969 - classification_loss: 0.1196 172/500 [=========>....................] - ETA: 1:51 - loss: 1.0194 - regression_loss: 0.8994 - classification_loss: 0.1200 173/500 [=========>....................] - ETA: 1:50 - loss: 1.0252 - regression_loss: 0.9045 - classification_loss: 0.1207 174/500 [=========>....................] 
- ETA: 1:50 - loss: 1.0230 - regression_loss: 0.9026 - classification_loss: 0.1204 175/500 [=========>....................] - ETA: 1:50 - loss: 1.0247 - regression_loss: 0.9039 - classification_loss: 0.1208 176/500 [=========>....................] - ETA: 1:49 - loss: 1.0257 - regression_loss: 0.9048 - classification_loss: 0.1209 177/500 [=========>....................] - ETA: 1:49 - loss: 1.0227 - regression_loss: 0.9023 - classification_loss: 0.1204 178/500 [=========>....................] - ETA: 1:49 - loss: 1.0214 - regression_loss: 0.9012 - classification_loss: 0.1202 179/500 [=========>....................] - ETA: 1:48 - loss: 1.0224 - regression_loss: 0.9021 - classification_loss: 0.1202 180/500 [=========>....................] - ETA: 1:48 - loss: 1.0209 - regression_loss: 0.9011 - classification_loss: 0.1198 181/500 [=========>....................] - ETA: 1:48 - loss: 1.0195 - regression_loss: 0.8997 - classification_loss: 0.1198 182/500 [=========>....................] - ETA: 1:47 - loss: 1.0196 - regression_loss: 0.8998 - classification_loss: 0.1198 183/500 [=========>....................] - ETA: 1:47 - loss: 1.0196 - regression_loss: 0.9000 - classification_loss: 0.1196 184/500 [==========>...................] - ETA: 1:47 - loss: 1.0177 - regression_loss: 0.8982 - classification_loss: 0.1194 185/500 [==========>...................] - ETA: 1:46 - loss: 1.0177 - regression_loss: 0.8985 - classification_loss: 0.1193 186/500 [==========>...................] - ETA: 1:46 - loss: 1.0180 - regression_loss: 0.8989 - classification_loss: 0.1191 187/500 [==========>...................] - ETA: 1:46 - loss: 1.0162 - regression_loss: 0.8975 - classification_loss: 0.1187 188/500 [==========>...................] - ETA: 1:45 - loss: 1.0191 - regression_loss: 0.8998 - classification_loss: 0.1192 189/500 [==========>...................] - ETA: 1:45 - loss: 1.0161 - regression_loss: 0.8973 - classification_loss: 0.1188 190/500 [==========>...................] 
- ETA: 1:45 - loss: 1.0168 - regression_loss: 0.8978 - classification_loss: 0.1189 191/500 [==========>...................] - ETA: 1:44 - loss: 1.0183 - regression_loss: 0.8990 - classification_loss: 0.1193 192/500 [==========>...................] - ETA: 1:44 - loss: 1.0190 - regression_loss: 0.8995 - classification_loss: 0.1195 193/500 [==========>...................] - ETA: 1:44 - loss: 1.0171 - regression_loss: 0.8980 - classification_loss: 0.1191 194/500 [==========>...................] - ETA: 1:43 - loss: 1.0173 - regression_loss: 0.8981 - classification_loss: 0.1192 195/500 [==========>...................] - ETA: 1:43 - loss: 1.0181 - regression_loss: 0.8990 - classification_loss: 0.1191 196/500 [==========>...................] - ETA: 1:43 - loss: 1.0181 - regression_loss: 0.8990 - classification_loss: 0.1190 197/500 [==========>...................] - ETA: 1:42 - loss: 1.0174 - regression_loss: 0.8985 - classification_loss: 0.1188 198/500 [==========>...................] - ETA: 1:42 - loss: 1.0135 - regression_loss: 0.8951 - classification_loss: 0.1184 199/500 [==========>...................] - ETA: 1:41 - loss: 1.0139 - regression_loss: 0.8956 - classification_loss: 0.1183 200/500 [===========>..................] - ETA: 1:41 - loss: 1.0150 - regression_loss: 0.8965 - classification_loss: 0.1184 201/500 [===========>..................] - ETA: 1:41 - loss: 1.0140 - regression_loss: 0.8958 - classification_loss: 0.1182 202/500 [===========>..................] - ETA: 1:40 - loss: 1.0140 - regression_loss: 0.8960 - classification_loss: 0.1180 203/500 [===========>..................] - ETA: 1:40 - loss: 1.0130 - regression_loss: 0.8953 - classification_loss: 0.1176 204/500 [===========>..................] - ETA: 1:40 - loss: 1.0114 - regression_loss: 0.8941 - classification_loss: 0.1173 205/500 [===========>..................] - ETA: 1:39 - loss: 1.0148 - regression_loss: 0.8968 - classification_loss: 0.1180 206/500 [===========>..................] 
- ETA: 1:39 - loss: 1.0123 - regression_loss: 0.8948 - classification_loss: 0.1176 207/500 [===========>..................] - ETA: 1:39 - loss: 1.0122 - regression_loss: 0.8947 - classification_loss: 0.1175 208/500 [===========>..................] - ETA: 1:38 - loss: 1.0128 - regression_loss: 0.8954 - classification_loss: 0.1174 209/500 [===========>..................] - ETA: 1:38 - loss: 1.0118 - regression_loss: 0.8946 - classification_loss: 0.1172 210/500 [===========>..................] - ETA: 1:38 - loss: 1.0108 - regression_loss: 0.8939 - classification_loss: 0.1168 211/500 [===========>..................] - ETA: 1:37 - loss: 1.0095 - regression_loss: 0.8929 - classification_loss: 0.1166 212/500 [===========>..................] - ETA: 1:37 - loss: 1.0096 - regression_loss: 0.8931 - classification_loss: 0.1165 213/500 [===========>..................] - ETA: 1:37 - loss: 1.0097 - regression_loss: 0.8934 - classification_loss: 0.1164 214/500 [===========>..................] - ETA: 1:36 - loss: 1.0092 - regression_loss: 0.8928 - classification_loss: 0.1164 215/500 [===========>..................] - ETA: 1:36 - loss: 1.0116 - regression_loss: 0.8949 - classification_loss: 0.1167 216/500 [===========>..................] - ETA: 1:36 - loss: 1.0106 - regression_loss: 0.8941 - classification_loss: 0.1164 217/500 [============>.................] - ETA: 1:35 - loss: 1.0105 - regression_loss: 0.8942 - classification_loss: 0.1163 218/500 [============>.................] - ETA: 1:35 - loss: 1.0095 - regression_loss: 0.8932 - classification_loss: 0.1163 219/500 [============>.................] - ETA: 1:35 - loss: 1.0069 - regression_loss: 0.8909 - classification_loss: 0.1160 220/500 [============>.................] - ETA: 1:34 - loss: 1.0083 - regression_loss: 0.8920 - classification_loss: 0.1162 221/500 [============>.................] - ETA: 1:34 - loss: 1.0093 - regression_loss: 0.8930 - classification_loss: 0.1163 222/500 [============>.................] 
- ETA: 1:34 - loss: 1.0102 - regression_loss: 0.8937 - classification_loss: 0.1165 223/500 [============>.................] - ETA: 1:33 - loss: 1.0121 - regression_loss: 0.8954 - classification_loss: 0.1167 224/500 [============>.................] - ETA: 1:33 - loss: 1.0130 - regression_loss: 0.8962 - classification_loss: 0.1167 225/500 [============>.................] - ETA: 1:33 - loss: 1.0146 - regression_loss: 0.8978 - classification_loss: 0.1168 226/500 [============>.................] - ETA: 1:32 - loss: 1.0146 - regression_loss: 0.8980 - classification_loss: 0.1166 227/500 [============>.................] - ETA: 1:32 - loss: 1.0152 - regression_loss: 0.8985 - classification_loss: 0.1167 228/500 [============>.................] - ETA: 1:32 - loss: 1.0145 - regression_loss: 0.8981 - classification_loss: 0.1164 229/500 [============>.................] - ETA: 1:31 - loss: 1.0129 - regression_loss: 0.8968 - classification_loss: 0.1161 230/500 [============>.................] - ETA: 1:31 - loss: 1.0136 - regression_loss: 0.8972 - classification_loss: 0.1163 231/500 [============>.................] - ETA: 1:31 - loss: 1.0136 - regression_loss: 0.8973 - classification_loss: 0.1163 232/500 [============>.................] - ETA: 1:30 - loss: 1.0129 - regression_loss: 0.8968 - classification_loss: 0.1162 233/500 [============>.................] - ETA: 1:30 - loss: 1.0119 - regression_loss: 0.8960 - classification_loss: 0.1159 234/500 [=============>................] - ETA: 1:29 - loss: 1.0117 - regression_loss: 0.8960 - classification_loss: 0.1158 235/500 [=============>................] - ETA: 1:29 - loss: 1.0102 - regression_loss: 0.8947 - classification_loss: 0.1155 236/500 [=============>................] - ETA: 1:29 - loss: 1.0110 - regression_loss: 0.8954 - classification_loss: 0.1156 237/500 [=============>................] - ETA: 1:28 - loss: 1.0097 - regression_loss: 0.8944 - classification_loss: 0.1153 238/500 [=============>................] 
- ETA: 1:28 - loss: 1.0112 - regression_loss: 0.8958 - classification_loss: 0.1154 239/500 [=============>................] - ETA: 1:28 - loss: 1.0122 - regression_loss: 0.8965 - classification_loss: 0.1157 240/500 [=============>................] - ETA: 1:27 - loss: 1.0126 - regression_loss: 0.8965 - classification_loss: 0.1161 241/500 [=============>................] - ETA: 1:27 - loss: 1.0107 - regression_loss: 0.8948 - classification_loss: 0.1159 242/500 [=============>................] - ETA: 1:27 - loss: 1.0108 - regression_loss: 0.8950 - classification_loss: 0.1158 243/500 [=============>................] - ETA: 1:26 - loss: 1.0078 - regression_loss: 0.8924 - classification_loss: 0.1154 244/500 [=============>................] - ETA: 1:26 - loss: 1.0075 - regression_loss: 0.8921 - classification_loss: 0.1154 245/500 [=============>................] - ETA: 1:26 - loss: 1.0065 - regression_loss: 0.8912 - classification_loss: 0.1153 246/500 [=============>................] - ETA: 1:25 - loss: 1.0091 - regression_loss: 0.8937 - classification_loss: 0.1154 247/500 [=============>................] - ETA: 1:25 - loss: 1.0080 - regression_loss: 0.8929 - classification_loss: 0.1151 248/500 [=============>................] - ETA: 1:25 - loss: 1.0088 - regression_loss: 0.8935 - classification_loss: 0.1152 249/500 [=============>................] - ETA: 1:24 - loss: 1.0073 - regression_loss: 0.8923 - classification_loss: 0.1150 250/500 [==============>...............] - ETA: 1:24 - loss: 1.0077 - regression_loss: 0.8927 - classification_loss: 0.1150 251/500 [==============>...............] - ETA: 1:24 - loss: 1.0062 - regression_loss: 0.8915 - classification_loss: 0.1147 252/500 [==============>...............] - ETA: 1:23 - loss: 1.0064 - regression_loss: 0.8917 - classification_loss: 0.1146 253/500 [==============>...............] - ETA: 1:23 - loss: 1.0077 - regression_loss: 0.8929 - classification_loss: 0.1148 254/500 [==============>...............] 
- ETA: 1:23 - loss: 1.0084 - regression_loss: 0.8934 - classification_loss: 0.1150 255/500 [==============>...............] - ETA: 1:22 - loss: 1.0083 - regression_loss: 0.8934 - classification_loss: 0.1149 256/500 [==============>...............] - ETA: 1:22 - loss: 1.0075 - regression_loss: 0.8928 - classification_loss: 0.1147 257/500 [==============>...............] - ETA: 1:22 - loss: 1.0096 - regression_loss: 0.8946 - classification_loss: 0.1150 258/500 [==============>...............] - ETA: 1:21 - loss: 1.0108 - regression_loss: 0.8957 - classification_loss: 0.1152 259/500 [==============>...............] - ETA: 1:21 - loss: 1.0113 - regression_loss: 0.8961 - classification_loss: 0.1152 260/500 [==============>...............] - ETA: 1:21 - loss: 1.0130 - regression_loss: 0.8975 - classification_loss: 0.1154 261/500 [==============>...............] - ETA: 1:20 - loss: 1.0131 - regression_loss: 0.8977 - classification_loss: 0.1153 262/500 [==============>...............] - ETA: 1:20 - loss: 1.0151 - regression_loss: 0.8995 - classification_loss: 0.1156 263/500 [==============>...............] - ETA: 1:20 - loss: 1.0174 - regression_loss: 0.9013 - classification_loss: 0.1161 264/500 [==============>...............] - ETA: 1:19 - loss: 1.0185 - regression_loss: 0.9022 - classification_loss: 0.1162 265/500 [==============>...............] - ETA: 1:19 - loss: 1.0177 - regression_loss: 0.9015 - classification_loss: 0.1161 266/500 [==============>...............] - ETA: 1:19 - loss: 1.0172 - regression_loss: 0.9012 - classification_loss: 0.1160 267/500 [===============>..............] - ETA: 1:18 - loss: 1.0158 - regression_loss: 0.9001 - classification_loss: 0.1157 268/500 [===============>..............] - ETA: 1:18 - loss: 1.0161 - regression_loss: 0.9002 - classification_loss: 0.1159 269/500 [===============>..............] - ETA: 1:18 - loss: 1.0158 - regression_loss: 0.9000 - classification_loss: 0.1158 270/500 [===============>..............] 
- ETA: 1:17 - loss: 1.0139 - regression_loss: 0.8984 - classification_loss: 0.1155
[per-step progress for epoch 28, steps 271–499, elided]
500/500 [==============================] - 170s 340ms/step - loss: 0.9979 - regression_loss: 0.8861 - classification_loss: 0.1118
1172 instances of class plum with average precision: 0.7878
mAP: 0.7878
Epoch 00028: saving model to ./training/snapshots/resnet101_pascal_28.h5
Epoch 29/150
1/500 [..............................] - ETA: 2:48 - loss: 0.8053 - regression_loss: 0.7373 - classification_loss: 0.0680
[per-step progress for epoch 29, steps 2–8, elided]
9/500 [..............................] 
- ETA: 2:48 - loss: 1.1727 - regression_loss: 1.0660 - classification_loss: 0.1067
[per-step progress for epoch 29, steps 10–103, elided]
104/500 [=====>........................] - ETA: 2:14 - loss: 1.0033 - regression_loss: 0.8935 - classification_loss: 0.1098 105/500 [=====>........................] 
- ETA: 2:13 - loss: 1.0031 - regression_loss: 0.8936 - classification_loss: 0.1095 106/500 [=====>........................] - ETA: 2:13 - loss: 1.0050 - regression_loss: 0.8955 - classification_loss: 0.1095 107/500 [=====>........................] - ETA: 2:13 - loss: 1.0069 - regression_loss: 0.8971 - classification_loss: 0.1098 108/500 [=====>........................] - ETA: 2:12 - loss: 1.0087 - regression_loss: 0.8986 - classification_loss: 0.1101 109/500 [=====>........................] - ETA: 2:12 - loss: 1.0113 - regression_loss: 0.9002 - classification_loss: 0.1111 110/500 [=====>........................] - ETA: 2:12 - loss: 1.0096 - regression_loss: 0.8989 - classification_loss: 0.1106 111/500 [=====>........................] - ETA: 2:11 - loss: 1.0108 - regression_loss: 0.9001 - classification_loss: 0.1107 112/500 [=====>........................] - ETA: 2:11 - loss: 1.0080 - regression_loss: 0.8977 - classification_loss: 0.1102 113/500 [=====>........................] - ETA: 2:11 - loss: 1.0106 - regression_loss: 0.9000 - classification_loss: 0.1106 114/500 [=====>........................] - ETA: 2:10 - loss: 1.0133 - regression_loss: 0.9028 - classification_loss: 0.1105 115/500 [=====>........................] - ETA: 2:10 - loss: 1.0123 - regression_loss: 0.9021 - classification_loss: 0.1103 116/500 [=====>........................] - ETA: 2:10 - loss: 1.0119 - regression_loss: 0.9016 - classification_loss: 0.1103 117/500 [======>.......................] - ETA: 2:09 - loss: 1.0134 - regression_loss: 0.9027 - classification_loss: 0.1107 118/500 [======>.......................] - ETA: 2:09 - loss: 1.0267 - regression_loss: 0.9129 - classification_loss: 0.1138 119/500 [======>.......................] - ETA: 2:09 - loss: 1.0240 - regression_loss: 0.9106 - classification_loss: 0.1134 120/500 [======>.......................] - ETA: 2:08 - loss: 1.0235 - regression_loss: 0.9095 - classification_loss: 0.1140 121/500 [======>.......................] 
- ETA: 2:08 - loss: 1.0242 - regression_loss: 0.9102 - classification_loss: 0.1139 122/500 [======>.......................] - ETA: 2:07 - loss: 1.0228 - regression_loss: 0.9090 - classification_loss: 0.1138 123/500 [======>.......................] - ETA: 2:07 - loss: 1.0233 - regression_loss: 0.9094 - classification_loss: 0.1139 124/500 [======>.......................] - ETA: 2:07 - loss: 1.0213 - regression_loss: 0.9076 - classification_loss: 0.1136 125/500 [======>.......................] - ETA: 2:06 - loss: 1.0241 - regression_loss: 0.9099 - classification_loss: 0.1142 126/500 [======>.......................] - ETA: 2:06 - loss: 1.0251 - regression_loss: 0.9107 - classification_loss: 0.1145 127/500 [======>.......................] - ETA: 2:06 - loss: 1.0209 - regression_loss: 0.9071 - classification_loss: 0.1138 128/500 [======>.......................] - ETA: 2:05 - loss: 1.0234 - regression_loss: 0.9092 - classification_loss: 0.1142 129/500 [======>.......................] - ETA: 2:05 - loss: 1.0238 - regression_loss: 0.9093 - classification_loss: 0.1145 130/500 [======>.......................] - ETA: 2:05 - loss: 1.0244 - regression_loss: 0.9095 - classification_loss: 0.1149 131/500 [======>.......................] - ETA: 2:04 - loss: 1.0209 - regression_loss: 0.9068 - classification_loss: 0.1141 132/500 [======>.......................] - ETA: 2:04 - loss: 1.0217 - regression_loss: 0.9076 - classification_loss: 0.1142 133/500 [======>.......................] - ETA: 2:04 - loss: 1.0167 - regression_loss: 0.9032 - classification_loss: 0.1135 134/500 [=======>......................] - ETA: 2:03 - loss: 1.0188 - regression_loss: 0.9054 - classification_loss: 0.1134 135/500 [=======>......................] - ETA: 2:03 - loss: 1.0191 - regression_loss: 0.9058 - classification_loss: 0.1133 136/500 [=======>......................] - ETA: 2:03 - loss: 1.0186 - regression_loss: 0.9054 - classification_loss: 0.1132 137/500 [=======>......................] 
- ETA: 2:02 - loss: 1.0213 - regression_loss: 0.9076 - classification_loss: 0.1137 138/500 [=======>......................] - ETA: 2:02 - loss: 1.0221 - regression_loss: 0.9084 - classification_loss: 0.1137 139/500 [=======>......................] - ETA: 2:02 - loss: 1.0193 - regression_loss: 0.9061 - classification_loss: 0.1132 140/500 [=======>......................] - ETA: 2:01 - loss: 1.0175 - regression_loss: 0.9044 - classification_loss: 0.1130 141/500 [=======>......................] - ETA: 2:01 - loss: 1.0183 - regression_loss: 0.9050 - classification_loss: 0.1133 142/500 [=======>......................] - ETA: 2:01 - loss: 1.0186 - regression_loss: 0.9049 - classification_loss: 0.1137 143/500 [=======>......................] - ETA: 2:00 - loss: 1.0184 - regression_loss: 0.9047 - classification_loss: 0.1137 144/500 [=======>......................] - ETA: 2:00 - loss: 1.0183 - regression_loss: 0.9048 - classification_loss: 0.1134 145/500 [=======>......................] - ETA: 2:00 - loss: 1.0173 - regression_loss: 0.9040 - classification_loss: 0.1132 146/500 [=======>......................] - ETA: 1:59 - loss: 1.0163 - regression_loss: 0.9032 - classification_loss: 0.1131 147/500 [=======>......................] - ETA: 1:59 - loss: 1.0137 - regression_loss: 0.9009 - classification_loss: 0.1127 148/500 [=======>......................] - ETA: 1:59 - loss: 1.0135 - regression_loss: 0.9011 - classification_loss: 0.1124 149/500 [=======>......................] - ETA: 1:58 - loss: 1.0128 - regression_loss: 0.9006 - classification_loss: 0.1122 150/500 [========>.....................] - ETA: 1:58 - loss: 1.0117 - regression_loss: 0.9000 - classification_loss: 0.1117 151/500 [========>.....................] - ETA: 1:57 - loss: 1.0100 - regression_loss: 0.8987 - classification_loss: 0.1113 152/500 [========>.....................] - ETA: 1:57 - loss: 1.0097 - regression_loss: 0.8985 - classification_loss: 0.1113 153/500 [========>.....................] 
- ETA: 1:57 - loss: 1.0098 - regression_loss: 0.8985 - classification_loss: 0.1113 154/500 [========>.....................] - ETA: 1:57 - loss: 1.0065 - regression_loss: 0.8958 - classification_loss: 0.1108 155/500 [========>.....................] - ETA: 1:56 - loss: 1.0056 - regression_loss: 0.8950 - classification_loss: 0.1106 156/500 [========>.....................] - ETA: 1:56 - loss: 1.0036 - regression_loss: 0.8932 - classification_loss: 0.1104 157/500 [========>.....................] - ETA: 1:55 - loss: 1.0001 - regression_loss: 0.8903 - classification_loss: 0.1098 158/500 [========>.....................] - ETA: 1:55 - loss: 0.9997 - regression_loss: 0.8899 - classification_loss: 0.1099 159/500 [========>.....................] - ETA: 1:55 - loss: 0.9970 - regression_loss: 0.8876 - classification_loss: 0.1094 160/500 [========>.....................] - ETA: 1:55 - loss: 0.9995 - regression_loss: 0.8898 - classification_loss: 0.1098 161/500 [========>.....................] - ETA: 1:54 - loss: 0.9987 - regression_loss: 0.8889 - classification_loss: 0.1098 162/500 [========>.....................] - ETA: 1:54 - loss: 0.9993 - regression_loss: 0.8895 - classification_loss: 0.1098 163/500 [========>.....................] - ETA: 1:54 - loss: 1.0003 - regression_loss: 0.8904 - classification_loss: 0.1099 164/500 [========>.....................] - ETA: 1:53 - loss: 1.0025 - regression_loss: 0.8924 - classification_loss: 0.1101 165/500 [========>.....................] - ETA: 1:53 - loss: 1.0029 - regression_loss: 0.8927 - classification_loss: 0.1102 166/500 [========>.....................] - ETA: 1:53 - loss: 1.0028 - regression_loss: 0.8925 - classification_loss: 0.1103 167/500 [=========>....................] - ETA: 1:52 - loss: 1.0048 - regression_loss: 0.8940 - classification_loss: 0.1108 168/500 [=========>....................] - ETA: 1:52 - loss: 1.0054 - regression_loss: 0.8945 - classification_loss: 0.1109 169/500 [=========>....................] 
- ETA: 1:52 - loss: 1.0058 - regression_loss: 0.8949 - classification_loss: 0.1109 170/500 [=========>....................] - ETA: 1:51 - loss: 1.0091 - regression_loss: 0.8977 - classification_loss: 0.1114 171/500 [=========>....................] - ETA: 1:51 - loss: 1.0101 - regression_loss: 0.8986 - classification_loss: 0.1115 172/500 [=========>....................] - ETA: 1:51 - loss: 1.0119 - regression_loss: 0.9002 - classification_loss: 0.1117 173/500 [=========>....................] - ETA: 1:50 - loss: 1.0113 - regression_loss: 0.8996 - classification_loss: 0.1117 174/500 [=========>....................] - ETA: 1:50 - loss: 1.0119 - regression_loss: 0.9000 - classification_loss: 0.1118 175/500 [=========>....................] - ETA: 1:50 - loss: 1.0117 - regression_loss: 0.8999 - classification_loss: 0.1119 176/500 [=========>....................] - ETA: 1:49 - loss: 1.0099 - regression_loss: 0.8985 - classification_loss: 0.1114 177/500 [=========>....................] - ETA: 1:49 - loss: 1.0103 - regression_loss: 0.8985 - classification_loss: 0.1118 178/500 [=========>....................] - ETA: 1:49 - loss: 1.0098 - regression_loss: 0.8981 - classification_loss: 0.1117 179/500 [=========>....................] - ETA: 1:48 - loss: 1.0087 - regression_loss: 0.8970 - classification_loss: 0.1118 180/500 [=========>....................] - ETA: 1:48 - loss: 1.0089 - regression_loss: 0.8972 - classification_loss: 0.1117 181/500 [=========>....................] - ETA: 1:48 - loss: 1.0087 - regression_loss: 0.8971 - classification_loss: 0.1116 182/500 [=========>....................] - ETA: 1:47 - loss: 1.0105 - regression_loss: 0.8986 - classification_loss: 0.1120 183/500 [=========>....................] - ETA: 1:47 - loss: 1.0120 - regression_loss: 0.8998 - classification_loss: 0.1122 184/500 [==========>...................] - ETA: 1:46 - loss: 1.0145 - regression_loss: 0.9019 - classification_loss: 0.1125 185/500 [==========>...................] 
- ETA: 1:46 - loss: 1.0123 - regression_loss: 0.9001 - classification_loss: 0.1122 186/500 [==========>...................] - ETA: 1:46 - loss: 1.0132 - regression_loss: 0.9008 - classification_loss: 0.1124 187/500 [==========>...................] - ETA: 1:45 - loss: 1.0151 - regression_loss: 0.9026 - classification_loss: 0.1125 188/500 [==========>...................] - ETA: 1:45 - loss: 1.0153 - regression_loss: 0.9026 - classification_loss: 0.1127 189/500 [==========>...................] - ETA: 1:45 - loss: 1.0138 - regression_loss: 0.9013 - classification_loss: 0.1124 190/500 [==========>...................] - ETA: 1:44 - loss: 1.0140 - regression_loss: 0.9015 - classification_loss: 0.1125 191/500 [==========>...................] - ETA: 1:44 - loss: 1.0165 - regression_loss: 0.9036 - classification_loss: 0.1128 192/500 [==========>...................] - ETA: 1:44 - loss: 1.0147 - regression_loss: 0.9022 - classification_loss: 0.1125 193/500 [==========>...................] - ETA: 1:43 - loss: 1.0125 - regression_loss: 0.9004 - classification_loss: 0.1121 194/500 [==========>...................] - ETA: 1:43 - loss: 1.0126 - regression_loss: 0.9006 - classification_loss: 0.1120 195/500 [==========>...................] - ETA: 1:43 - loss: 1.0132 - regression_loss: 0.9009 - classification_loss: 0.1123 196/500 [==========>...................] - ETA: 1:42 - loss: 1.0144 - regression_loss: 0.9022 - classification_loss: 0.1122 197/500 [==========>...................] - ETA: 1:42 - loss: 1.0150 - regression_loss: 0.9028 - classification_loss: 0.1123 198/500 [==========>...................] - ETA: 1:42 - loss: 1.0133 - regression_loss: 0.9015 - classification_loss: 0.1119 199/500 [==========>...................] - ETA: 1:41 - loss: 1.0140 - regression_loss: 0.9017 - classification_loss: 0.1123 200/500 [===========>..................] - ETA: 1:41 - loss: 1.0150 - regression_loss: 0.9026 - classification_loss: 0.1125 201/500 [===========>..................] 
- ETA: 1:41 - loss: 1.0162 - regression_loss: 0.9035 - classification_loss: 0.1127 202/500 [===========>..................] - ETA: 1:40 - loss: 1.0137 - regression_loss: 0.9015 - classification_loss: 0.1122 203/500 [===========>..................] - ETA: 1:40 - loss: 1.0155 - regression_loss: 0.9028 - classification_loss: 0.1126 204/500 [===========>..................] - ETA: 1:40 - loss: 1.0127 - regression_loss: 0.9004 - classification_loss: 0.1123 205/500 [===========>..................] - ETA: 1:39 - loss: 1.0119 - regression_loss: 0.8996 - classification_loss: 0.1123 206/500 [===========>..................] - ETA: 1:39 - loss: 1.0154 - regression_loss: 0.9026 - classification_loss: 0.1129 207/500 [===========>..................] - ETA: 1:39 - loss: 1.0140 - regression_loss: 0.9014 - classification_loss: 0.1127 208/500 [===========>..................] - ETA: 1:38 - loss: 1.0109 - regression_loss: 0.8986 - classification_loss: 0.1123 209/500 [===========>..................] - ETA: 1:38 - loss: 1.0131 - regression_loss: 0.9005 - classification_loss: 0.1126 210/500 [===========>..................] - ETA: 1:38 - loss: 1.0126 - regression_loss: 0.8999 - classification_loss: 0.1127 211/500 [===========>..................] - ETA: 1:37 - loss: 1.0111 - regression_loss: 0.8986 - classification_loss: 0.1125 212/500 [===========>..................] - ETA: 1:37 - loss: 1.0140 - regression_loss: 0.9015 - classification_loss: 0.1125 213/500 [===========>..................] - ETA: 1:37 - loss: 1.0161 - regression_loss: 0.9031 - classification_loss: 0.1130 214/500 [===========>..................] - ETA: 1:36 - loss: 1.0174 - regression_loss: 0.9043 - classification_loss: 0.1131 215/500 [===========>..................] - ETA: 1:36 - loss: 1.0171 - regression_loss: 0.9038 - classification_loss: 0.1133 216/500 [===========>..................] - ETA: 1:36 - loss: 1.0140 - regression_loss: 0.9010 - classification_loss: 0.1130 217/500 [============>.................] 
- ETA: 1:35 - loss: 1.0134 - regression_loss: 0.9005 - classification_loss: 0.1129 218/500 [============>.................] - ETA: 1:35 - loss: 1.0124 - regression_loss: 0.8998 - classification_loss: 0.1126 219/500 [============>.................] - ETA: 1:35 - loss: 1.0161 - regression_loss: 0.9027 - classification_loss: 0.1134 220/500 [============>.................] - ETA: 1:34 - loss: 1.0171 - regression_loss: 0.9037 - classification_loss: 0.1135 221/500 [============>.................] - ETA: 1:34 - loss: 1.0153 - regression_loss: 0.9022 - classification_loss: 0.1131 222/500 [============>.................] - ETA: 1:34 - loss: 1.0148 - regression_loss: 0.9020 - classification_loss: 0.1128 223/500 [============>.................] - ETA: 1:33 - loss: 1.0159 - regression_loss: 0.9027 - classification_loss: 0.1131 224/500 [============>.................] - ETA: 1:33 - loss: 1.0136 - regression_loss: 0.9008 - classification_loss: 0.1128 225/500 [============>.................] - ETA: 1:33 - loss: 1.0132 - regression_loss: 0.9004 - classification_loss: 0.1127 226/500 [============>.................] - ETA: 1:32 - loss: 1.0146 - regression_loss: 0.9017 - classification_loss: 0.1128 227/500 [============>.................] - ETA: 1:32 - loss: 1.0164 - regression_loss: 0.9035 - classification_loss: 0.1129 228/500 [============>.................] - ETA: 1:32 - loss: 1.0174 - regression_loss: 0.9042 - classification_loss: 0.1132 229/500 [============>.................] - ETA: 1:31 - loss: 1.0180 - regression_loss: 0.9049 - classification_loss: 0.1131 230/500 [============>.................] - ETA: 1:31 - loss: 1.0167 - regression_loss: 0.9037 - classification_loss: 0.1129 231/500 [============>.................] - ETA: 1:31 - loss: 1.0168 - regression_loss: 0.9040 - classification_loss: 0.1128 232/500 [============>.................] - ETA: 1:30 - loss: 1.0163 - regression_loss: 0.9033 - classification_loss: 0.1129 233/500 [============>.................] 
- ETA: 1:30 - loss: 1.0157 - regression_loss: 0.9030 - classification_loss: 0.1128 234/500 [=============>................] - ETA: 1:30 - loss: 1.0147 - regression_loss: 0.9023 - classification_loss: 0.1124 235/500 [=============>................] - ETA: 1:29 - loss: 1.0136 - regression_loss: 0.9015 - classification_loss: 0.1121 236/500 [=============>................] - ETA: 1:29 - loss: 1.0143 - regression_loss: 0.9021 - classification_loss: 0.1121 237/500 [=============>................] - ETA: 1:29 - loss: 1.0132 - regression_loss: 0.9011 - classification_loss: 0.1120 238/500 [=============>................] - ETA: 1:28 - loss: 1.0139 - regression_loss: 0.9021 - classification_loss: 0.1119 239/500 [=============>................] - ETA: 1:28 - loss: 1.0141 - regression_loss: 0.9024 - classification_loss: 0.1117 240/500 [=============>................] - ETA: 1:28 - loss: 1.0157 - regression_loss: 0.9040 - classification_loss: 0.1117 241/500 [=============>................] - ETA: 1:27 - loss: 1.0135 - regression_loss: 0.9022 - classification_loss: 0.1113 242/500 [=============>................] - ETA: 1:27 - loss: 1.0155 - regression_loss: 0.9037 - classification_loss: 0.1118 243/500 [=============>................] - ETA: 1:27 - loss: 1.0142 - regression_loss: 0.9025 - classification_loss: 0.1117 244/500 [=============>................] - ETA: 1:26 - loss: 1.0134 - regression_loss: 0.9018 - classification_loss: 0.1116 245/500 [=============>................] - ETA: 1:26 - loss: 1.0138 - regression_loss: 0.9022 - classification_loss: 0.1116 246/500 [=============>................] - ETA: 1:26 - loss: 1.0151 - regression_loss: 0.9034 - classification_loss: 0.1117 247/500 [=============>................] - ETA: 1:25 - loss: 1.0153 - regression_loss: 0.9034 - classification_loss: 0.1119 248/500 [=============>................] - ETA: 1:25 - loss: 1.0156 - regression_loss: 0.9038 - classification_loss: 0.1118 249/500 [=============>................] 
- ETA: 1:25 - loss: 1.0167 - regression_loss: 0.9048 - classification_loss: 0.1119 250/500 [==============>...............] - ETA: 1:24 - loss: 1.0157 - regression_loss: 0.9040 - classification_loss: 0.1118 251/500 [==============>...............] - ETA: 1:24 - loss: 1.0149 - regression_loss: 0.9035 - classification_loss: 0.1115 252/500 [==============>...............] - ETA: 1:24 - loss: 1.0169 - regression_loss: 0.9052 - classification_loss: 0.1117 253/500 [==============>...............] - ETA: 1:23 - loss: 1.0168 - regression_loss: 0.9050 - classification_loss: 0.1117 254/500 [==============>...............] - ETA: 1:23 - loss: 1.0167 - regression_loss: 0.9053 - classification_loss: 0.1114 255/500 [==============>...............] - ETA: 1:23 - loss: 1.0141 - regression_loss: 0.9030 - classification_loss: 0.1111 256/500 [==============>...............] - ETA: 1:22 - loss: 1.0139 - regression_loss: 0.9030 - classification_loss: 0.1109 257/500 [==============>...............] - ETA: 1:22 - loss: 1.0156 - regression_loss: 0.9044 - classification_loss: 0.1111 258/500 [==============>...............] - ETA: 1:21 - loss: 1.0153 - regression_loss: 0.9043 - classification_loss: 0.1110 259/500 [==============>...............] - ETA: 1:21 - loss: 1.0146 - regression_loss: 0.9037 - classification_loss: 0.1109 260/500 [==============>...............] - ETA: 1:21 - loss: 1.0148 - regression_loss: 0.9038 - classification_loss: 0.1109 261/500 [==============>...............] - ETA: 1:20 - loss: 1.0140 - regression_loss: 0.9033 - classification_loss: 0.1108 262/500 [==============>...............] - ETA: 1:20 - loss: 1.0145 - regression_loss: 0.9037 - classification_loss: 0.1108 263/500 [==============>...............] - ETA: 1:20 - loss: 1.0140 - regression_loss: 0.9032 - classification_loss: 0.1108 264/500 [==============>...............] - ETA: 1:19 - loss: 1.0142 - regression_loss: 0.9034 - classification_loss: 0.1107 265/500 [==============>...............] 
- ETA: 1:19 - loss: 1.0140 - regression_loss: 0.9033 - classification_loss: 0.1107 266/500 [==============>...............] - ETA: 1:19 - loss: 1.0137 - regression_loss: 0.9031 - classification_loss: 0.1106 267/500 [===============>..............] - ETA: 1:18 - loss: 1.0131 - regression_loss: 0.9026 - classification_loss: 0.1105 268/500 [===============>..............] - ETA: 1:18 - loss: 1.0123 - regression_loss: 0.9019 - classification_loss: 0.1103 269/500 [===============>..............] - ETA: 1:18 - loss: 1.0142 - regression_loss: 0.9034 - classification_loss: 0.1108 270/500 [===============>..............] - ETA: 1:17 - loss: 1.0134 - regression_loss: 0.9026 - classification_loss: 0.1108 271/500 [===============>..............] - ETA: 1:17 - loss: 1.0128 - regression_loss: 0.9021 - classification_loss: 0.1107 272/500 [===============>..............] - ETA: 1:17 - loss: 1.0151 - regression_loss: 0.9040 - classification_loss: 0.1111 273/500 [===============>..............] - ETA: 1:16 - loss: 1.0136 - regression_loss: 0.9027 - classification_loss: 0.1109 274/500 [===============>..............] - ETA: 1:16 - loss: 1.0144 - regression_loss: 0.9032 - classification_loss: 0.1112 275/500 [===============>..............] - ETA: 1:16 - loss: 1.0134 - regression_loss: 0.9021 - classification_loss: 0.1112 276/500 [===============>..............] - ETA: 1:15 - loss: 1.0164 - regression_loss: 0.9045 - classification_loss: 0.1119 277/500 [===============>..............] - ETA: 1:15 - loss: 1.0177 - regression_loss: 0.9060 - classification_loss: 0.1118 278/500 [===============>..............] - ETA: 1:15 - loss: 1.0173 - regression_loss: 0.9057 - classification_loss: 0.1116 279/500 [===============>..............] - ETA: 1:14 - loss: 1.0163 - regression_loss: 0.9048 - classification_loss: 0.1115 280/500 [===============>..............] - ETA: 1:14 - loss: 1.0170 - regression_loss: 0.9053 - classification_loss: 0.1117 281/500 [===============>..............] 
- ETA: 1:14 - loss: 1.0152 - regression_loss: 0.9038 - classification_loss: 0.1114 282/500 [===============>..............] - ETA: 1:13 - loss: 1.0151 - regression_loss: 0.9037 - classification_loss: 0.1114 283/500 [===============>..............] - ETA: 1:13 - loss: 1.0148 - regression_loss: 0.9036 - classification_loss: 0.1112 284/500 [================>.............] - ETA: 1:13 - loss: 1.0149 - regression_loss: 0.9038 - classification_loss: 0.1111 285/500 [================>.............] - ETA: 1:12 - loss: 1.0144 - regression_loss: 0.9034 - classification_loss: 0.1110 286/500 [================>.............] - ETA: 1:12 - loss: 1.0139 - regression_loss: 0.9030 - classification_loss: 0.1109 287/500 [================>.............] - ETA: 1:12 - loss: 1.0142 - regression_loss: 0.9032 - classification_loss: 0.1110 288/500 [================>.............] - ETA: 1:11 - loss: 1.0131 - regression_loss: 0.9022 - classification_loss: 0.1109 289/500 [================>.............] - ETA: 1:11 - loss: 1.0128 - regression_loss: 0.9020 - classification_loss: 0.1108 290/500 [================>.............] - ETA: 1:11 - loss: 1.0140 - regression_loss: 0.9030 - classification_loss: 0.1109 291/500 [================>.............] - ETA: 1:10 - loss: 1.0152 - regression_loss: 0.9041 - classification_loss: 0.1111 292/500 [================>.............] - ETA: 1:10 - loss: 1.0174 - regression_loss: 0.9061 - classification_loss: 0.1113 293/500 [================>.............] - ETA: 1:10 - loss: 1.0187 - regression_loss: 0.9072 - classification_loss: 0.1115 294/500 [================>.............] - ETA: 1:09 - loss: 1.0186 - regression_loss: 0.9072 - classification_loss: 0.1113 295/500 [================>.............] - ETA: 1:09 - loss: 1.0202 - regression_loss: 0.9086 - classification_loss: 0.1115 296/500 [================>.............] - ETA: 1:09 - loss: 1.0209 - regression_loss: 0.9094 - classification_loss: 0.1115 297/500 [================>.............] 
- ETA: 1:08 - loss: 1.0223 - regression_loss: 0.9107 - classification_loss: 0.1115 298/500 [================>.............] - ETA: 1:08 - loss: 1.0235 - regression_loss: 0.9118 - classification_loss: 0.1117 299/500 [================>.............] - ETA: 1:08 - loss: 1.0244 - regression_loss: 0.9127 - classification_loss: 0.1118 300/500 [=================>............] - ETA: 1:07 - loss: 1.0233 - regression_loss: 0.9117 - classification_loss: 0.1116 301/500 [=================>............] - ETA: 1:07 - loss: 1.0245 - regression_loss: 0.9127 - classification_loss: 0.1117 302/500 [=================>............] - ETA: 1:07 - loss: 1.0259 - regression_loss: 0.9138 - classification_loss: 0.1120 303/500 [=================>............] - ETA: 1:06 - loss: 1.0242 - regression_loss: 0.9125 - classification_loss: 0.1118 304/500 [=================>............] - ETA: 1:06 - loss: 1.0257 - regression_loss: 0.9138 - classification_loss: 0.1119 305/500 [=================>............] - ETA: 1:06 - loss: 1.0262 - regression_loss: 0.9143 - classification_loss: 0.1120 306/500 [=================>............] - ETA: 1:05 - loss: 1.0248 - regression_loss: 0.9130 - classification_loss: 0.1118 307/500 [=================>............] - ETA: 1:05 - loss: 1.0244 - regression_loss: 0.9126 - classification_loss: 0.1118 308/500 [=================>............] - ETA: 1:05 - loss: 1.0251 - regression_loss: 0.9135 - classification_loss: 0.1117 309/500 [=================>............] - ETA: 1:04 - loss: 1.0242 - regression_loss: 0.9127 - classification_loss: 0.1115 310/500 [=================>............] - ETA: 1:04 - loss: 1.0229 - regression_loss: 0.9115 - classification_loss: 0.1113 311/500 [=================>............] - ETA: 1:04 - loss: 1.0215 - regression_loss: 0.9104 - classification_loss: 0.1111 312/500 [=================>............] - ETA: 1:03 - loss: 1.0214 - regression_loss: 0.9103 - classification_loss: 0.1111 313/500 [=================>............] 
- ETA: 1:03 - loss: 1.0209 - regression_loss: 0.9097 - classification_loss: 0.1112
[... per-batch progress lines for steps 314-499 of epoch 29 omitted ...]
500/500 [==============================] - 170s 340ms/step - loss: 1.0195 - regression_loss: 0.9039 - classification_loss: 0.1155
1172 instances of class plum with average precision: 0.7946
mAP: 0.7946
Epoch 00029: saving model to ./training/snapshots/resnet101_pascal_29.h5
Epoch 30/150
1/500 [..............................] - ETA: 2:37 - loss: 1.0193 - regression_loss: 0.8967 - classification_loss: 0.1226
[... per-batch progress lines for steps 2-3 omitted ...]
4/500 [..............................]
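In this log format, the reported loss is the sum of regression_loss and classification_loss (as in keras-retinanet's RetinaNet training, where the two subnetwork losses are added). A minimal sketch, assuming only the log line format shown above, that parses a summary line and checks the decomposition:

```python
import re

def parse_summary(line):
    """Extract the named loss values from a Keras progress-bar summary line."""
    return {k: float(v) for k, v in re.findall(r"(\w+_loss|loss): ([0-9.]+)", line)}

# Epoch-29 final summary line, copied from the log above.
line = ("500/500 [==============================] - 170s 340ms/step - "
        "loss: 1.0195 - regression_loss: 0.9039 - classification_loss: 0.1155")
m = parse_summary(line)

# Total loss equals the sum of the two components (up to display rounding).
assert abs(m["loss"] - (m["regression_loss"] + m["classification_loss"])) < 1e-3
```

The same parser works on the per-batch lines, which can be useful for plotting the loss curves from a saved console log.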
- ETA: 2:44 - loss: 1.1134 - regression_loss: 0.9883 - classification_loss: 0.1251
[... per-batch progress lines for steps 5-147 of epoch 30 omitted ...]
148/500 [=======>......................]
- ETA: 1:58 - loss: 1.0196 - regression_loss: 0.9079 - classification_loss: 0.1118 149/500 [=======>......................] - ETA: 1:58 - loss: 1.0208 - regression_loss: 0.9089 - classification_loss: 0.1119 150/500 [========>.....................] - ETA: 1:58 - loss: 1.0204 - regression_loss: 0.9085 - classification_loss: 0.1118 151/500 [========>.....................] - ETA: 1:57 - loss: 1.0200 - regression_loss: 0.9078 - classification_loss: 0.1122 152/500 [========>.....................] - ETA: 1:57 - loss: 1.0183 - regression_loss: 0.9065 - classification_loss: 0.1118 153/500 [========>.....................] - ETA: 1:57 - loss: 1.0228 - regression_loss: 0.9109 - classification_loss: 0.1120 154/500 [========>.....................] - ETA: 1:56 - loss: 1.0228 - regression_loss: 0.9108 - classification_loss: 0.1120 155/500 [========>.....................] - ETA: 1:56 - loss: 1.0267 - regression_loss: 0.9144 - classification_loss: 0.1123 156/500 [========>.....................] - ETA: 1:56 - loss: 1.0251 - regression_loss: 0.9129 - classification_loss: 0.1121 157/500 [========>.....................] - ETA: 1:55 - loss: 1.0268 - regression_loss: 0.9146 - classification_loss: 0.1123 158/500 [========>.....................] - ETA: 1:55 - loss: 1.0240 - regression_loss: 0.9120 - classification_loss: 0.1120 159/500 [========>.....................] - ETA: 1:55 - loss: 1.0250 - regression_loss: 0.9133 - classification_loss: 0.1117 160/500 [========>.....................] - ETA: 1:54 - loss: 1.0258 - regression_loss: 0.9143 - classification_loss: 0.1115 161/500 [========>.....................] - ETA: 1:54 - loss: 1.0269 - regression_loss: 0.9152 - classification_loss: 0.1117 162/500 [========>.....................] - ETA: 1:54 - loss: 1.0286 - regression_loss: 0.9167 - classification_loss: 0.1119 163/500 [========>.....................] - ETA: 1:53 - loss: 1.0253 - regression_loss: 0.9138 - classification_loss: 0.1115 164/500 [========>.....................] 
- ETA: 1:53 - loss: 1.0245 - regression_loss: 0.9132 - classification_loss: 0.1114 165/500 [========>.....................] - ETA: 1:53 - loss: 1.0220 - regression_loss: 0.9109 - classification_loss: 0.1112 166/500 [========>.....................] - ETA: 1:52 - loss: 1.0185 - regression_loss: 0.9078 - classification_loss: 0.1107 167/500 [=========>....................] - ETA: 1:52 - loss: 1.0155 - regression_loss: 0.9051 - classification_loss: 0.1104 168/500 [=========>....................] - ETA: 1:52 - loss: 1.0116 - regression_loss: 0.9017 - classification_loss: 0.1099 169/500 [=========>....................] - ETA: 1:51 - loss: 1.0116 - regression_loss: 0.9018 - classification_loss: 0.1098 170/500 [=========>....................] - ETA: 1:51 - loss: 1.0095 - regression_loss: 0.9001 - classification_loss: 0.1094 171/500 [=========>....................] - ETA: 1:51 - loss: 1.0117 - regression_loss: 0.9020 - classification_loss: 0.1097 172/500 [=========>....................] - ETA: 1:50 - loss: 1.0127 - regression_loss: 0.9027 - classification_loss: 0.1100 173/500 [=========>....................] - ETA: 1:50 - loss: 1.0114 - regression_loss: 0.9014 - classification_loss: 0.1100 174/500 [=========>....................] - ETA: 1:50 - loss: 1.0118 - regression_loss: 0.9017 - classification_loss: 0.1101 175/500 [=========>....................] - ETA: 1:49 - loss: 1.0073 - regression_loss: 0.8976 - classification_loss: 0.1097 176/500 [=========>....................] - ETA: 1:49 - loss: 1.0065 - regression_loss: 0.8969 - classification_loss: 0.1096 177/500 [=========>....................] - ETA: 1:49 - loss: 1.0088 - regression_loss: 0.8989 - classification_loss: 0.1099 178/500 [=========>....................] - ETA: 1:48 - loss: 1.0091 - regression_loss: 0.8990 - classification_loss: 0.1101 179/500 [=========>....................] - ETA: 1:48 - loss: 1.0118 - regression_loss: 0.9013 - classification_loss: 0.1105 180/500 [=========>....................] 
- ETA: 1:48 - loss: 1.0127 - regression_loss: 0.9021 - classification_loss: 0.1106 181/500 [=========>....................] - ETA: 1:47 - loss: 1.0106 - regression_loss: 0.9003 - classification_loss: 0.1103 182/500 [=========>....................] - ETA: 1:47 - loss: 1.0091 - regression_loss: 0.8991 - classification_loss: 0.1100 183/500 [=========>....................] - ETA: 1:47 - loss: 1.0200 - regression_loss: 0.9070 - classification_loss: 0.1130 184/500 [==========>...................] - ETA: 1:46 - loss: 1.0178 - regression_loss: 0.9051 - classification_loss: 0.1126 185/500 [==========>...................] - ETA: 1:46 - loss: 1.0169 - regression_loss: 0.9044 - classification_loss: 0.1125 186/500 [==========>...................] - ETA: 1:46 - loss: 1.0139 - regression_loss: 0.9019 - classification_loss: 0.1120 187/500 [==========>...................] - ETA: 1:46 - loss: 1.0140 - regression_loss: 0.9020 - classification_loss: 0.1120 188/500 [==========>...................] - ETA: 1:45 - loss: 1.0149 - regression_loss: 0.9033 - classification_loss: 0.1116 189/500 [==========>...................] - ETA: 1:45 - loss: 1.0134 - regression_loss: 0.9021 - classification_loss: 0.1113 190/500 [==========>...................] - ETA: 1:45 - loss: 1.0145 - regression_loss: 0.9029 - classification_loss: 0.1116 191/500 [==========>...................] - ETA: 1:44 - loss: 1.0150 - regression_loss: 0.9037 - classification_loss: 0.1114 192/500 [==========>...................] - ETA: 1:44 - loss: 1.0136 - regression_loss: 0.9026 - classification_loss: 0.1110 193/500 [==========>...................] - ETA: 1:44 - loss: 1.0093 - regression_loss: 0.8987 - classification_loss: 0.1106 194/500 [==========>...................] - ETA: 1:43 - loss: 1.0109 - regression_loss: 0.9001 - classification_loss: 0.1108 195/500 [==========>...................] - ETA: 1:43 - loss: 1.0129 - regression_loss: 0.9018 - classification_loss: 0.1111 196/500 [==========>...................] 
- ETA: 1:42 - loss: 1.0135 - regression_loss: 0.9023 - classification_loss: 0.1112 197/500 [==========>...................] - ETA: 1:42 - loss: 1.0134 - regression_loss: 0.9023 - classification_loss: 0.1111 198/500 [==========>...................] - ETA: 1:42 - loss: 1.0123 - regression_loss: 0.9015 - classification_loss: 0.1108 199/500 [==========>...................] - ETA: 1:42 - loss: 1.0128 - regression_loss: 0.9020 - classification_loss: 0.1109 200/500 [===========>..................] - ETA: 1:41 - loss: 1.0095 - regression_loss: 0.8991 - classification_loss: 0.1104 201/500 [===========>..................] - ETA: 1:41 - loss: 1.0099 - regression_loss: 0.8995 - classification_loss: 0.1104 202/500 [===========>..................] - ETA: 1:40 - loss: 1.0110 - regression_loss: 0.9004 - classification_loss: 0.1106 203/500 [===========>..................] - ETA: 1:40 - loss: 1.0124 - regression_loss: 0.9019 - classification_loss: 0.1105 204/500 [===========>..................] - ETA: 1:40 - loss: 1.0119 - regression_loss: 0.9014 - classification_loss: 0.1105 205/500 [===========>..................] - ETA: 1:40 - loss: 1.0130 - regression_loss: 0.9024 - classification_loss: 0.1106 206/500 [===========>..................] - ETA: 1:39 - loss: 1.0146 - regression_loss: 0.9038 - classification_loss: 0.1108 207/500 [===========>..................] - ETA: 1:39 - loss: 1.0149 - regression_loss: 0.9040 - classification_loss: 0.1109 208/500 [===========>..................] - ETA: 1:38 - loss: 1.0145 - regression_loss: 0.9037 - classification_loss: 0.1107 209/500 [===========>..................] - ETA: 1:38 - loss: 1.0161 - regression_loss: 0.9053 - classification_loss: 0.1108 210/500 [===========>..................] - ETA: 1:38 - loss: 1.0164 - regression_loss: 0.9055 - classification_loss: 0.1109 211/500 [===========>..................] - ETA: 1:38 - loss: 1.0155 - regression_loss: 0.9047 - classification_loss: 0.1108 212/500 [===========>..................] 
- ETA: 1:37 - loss: 1.0174 - regression_loss: 0.9063 - classification_loss: 0.1111 213/500 [===========>..................] - ETA: 1:37 - loss: 1.0189 - regression_loss: 0.9079 - classification_loss: 0.1110 214/500 [===========>..................] - ETA: 1:37 - loss: 1.0202 - regression_loss: 0.9091 - classification_loss: 0.1111 215/500 [===========>..................] - ETA: 1:36 - loss: 1.0200 - regression_loss: 0.9088 - classification_loss: 0.1112 216/500 [===========>..................] - ETA: 1:36 - loss: 1.0204 - regression_loss: 0.9091 - classification_loss: 0.1113 217/500 [============>.................] - ETA: 1:36 - loss: 1.0230 - regression_loss: 0.9112 - classification_loss: 0.1118 218/500 [============>.................] - ETA: 1:35 - loss: 1.0234 - regression_loss: 0.9114 - classification_loss: 0.1119 219/500 [============>.................] - ETA: 1:35 - loss: 1.0232 - regression_loss: 0.9111 - classification_loss: 0.1121 220/500 [============>.................] - ETA: 1:35 - loss: 1.0251 - regression_loss: 0.9128 - classification_loss: 0.1123 221/500 [============>.................] - ETA: 1:34 - loss: 1.0268 - regression_loss: 0.9142 - classification_loss: 0.1126 222/500 [============>.................] - ETA: 1:34 - loss: 1.0247 - regression_loss: 0.9124 - classification_loss: 0.1123 223/500 [============>.................] - ETA: 1:34 - loss: 1.0264 - regression_loss: 0.9136 - classification_loss: 0.1128 224/500 [============>.................] - ETA: 1:33 - loss: 1.0258 - regression_loss: 0.9129 - classification_loss: 0.1128 225/500 [============>.................] - ETA: 1:33 - loss: 1.0277 - regression_loss: 0.9144 - classification_loss: 0.1132 226/500 [============>.................] - ETA: 1:33 - loss: 1.0301 - regression_loss: 0.9167 - classification_loss: 0.1134 227/500 [============>.................] - ETA: 1:32 - loss: 1.0284 - regression_loss: 0.9152 - classification_loss: 0.1132 228/500 [============>.................] 
- ETA: 1:32 - loss: 1.0299 - regression_loss: 0.9163 - classification_loss: 0.1136 229/500 [============>.................] - ETA: 1:32 - loss: 1.0310 - regression_loss: 0.9171 - classification_loss: 0.1139 230/500 [============>.................] - ETA: 1:31 - loss: 1.0282 - regression_loss: 0.9146 - classification_loss: 0.1136 231/500 [============>.................] - ETA: 1:31 - loss: 1.0262 - regression_loss: 0.9131 - classification_loss: 0.1131 232/500 [============>.................] - ETA: 1:31 - loss: 1.0260 - regression_loss: 0.9130 - classification_loss: 0.1130 233/500 [============>.................] - ETA: 1:30 - loss: 1.0242 - regression_loss: 0.9114 - classification_loss: 0.1128 234/500 [=============>................] - ETA: 1:30 - loss: 1.0212 - regression_loss: 0.9088 - classification_loss: 0.1124 235/500 [=============>................] - ETA: 1:30 - loss: 1.0216 - regression_loss: 0.9093 - classification_loss: 0.1123 236/500 [=============>................] - ETA: 1:29 - loss: 1.0219 - regression_loss: 0.9095 - classification_loss: 0.1125 237/500 [=============>................] - ETA: 1:29 - loss: 1.0220 - regression_loss: 0.9096 - classification_loss: 0.1124 238/500 [=============>................] - ETA: 1:29 - loss: 1.0197 - regression_loss: 0.9076 - classification_loss: 0.1121 239/500 [=============>................] - ETA: 1:28 - loss: 1.0182 - regression_loss: 0.9063 - classification_loss: 0.1119 240/500 [=============>................] - ETA: 1:28 - loss: 1.0167 - regression_loss: 0.9050 - classification_loss: 0.1117 241/500 [=============>................] - ETA: 1:28 - loss: 1.0144 - regression_loss: 0.9030 - classification_loss: 0.1114 242/500 [=============>................] - ETA: 1:27 - loss: 1.0130 - regression_loss: 0.9019 - classification_loss: 0.1112 243/500 [=============>................] - ETA: 1:27 - loss: 1.0123 - regression_loss: 0.9011 - classification_loss: 0.1112 244/500 [=============>................] 
- ETA: 1:27 - loss: 1.0099 - regression_loss: 0.8991 - classification_loss: 0.1108 245/500 [=============>................] - ETA: 1:26 - loss: 1.0107 - regression_loss: 0.8998 - classification_loss: 0.1110 246/500 [=============>................] - ETA: 1:26 - loss: 1.0099 - regression_loss: 0.8991 - classification_loss: 0.1108 247/500 [=============>................] - ETA: 1:25 - loss: 1.0106 - regression_loss: 0.8997 - classification_loss: 0.1110 248/500 [=============>................] - ETA: 1:25 - loss: 1.0118 - regression_loss: 0.9007 - classification_loss: 0.1111 249/500 [=============>................] - ETA: 1:25 - loss: 1.0143 - regression_loss: 0.9026 - classification_loss: 0.1117 250/500 [==============>...............] - ETA: 1:25 - loss: 1.0141 - regression_loss: 0.9026 - classification_loss: 0.1115 251/500 [==============>...............] - ETA: 1:24 - loss: 1.0149 - regression_loss: 0.9034 - classification_loss: 0.1115 252/500 [==============>...............] - ETA: 1:24 - loss: 1.0145 - regression_loss: 0.9030 - classification_loss: 0.1114 253/500 [==============>...............] - ETA: 1:24 - loss: 1.0140 - regression_loss: 0.9026 - classification_loss: 0.1114 254/500 [==============>...............] - ETA: 1:23 - loss: 1.0135 - regression_loss: 0.9021 - classification_loss: 0.1114 255/500 [==============>...............] - ETA: 1:23 - loss: 1.0151 - regression_loss: 0.9032 - classification_loss: 0.1118 256/500 [==============>...............] - ETA: 1:22 - loss: 1.0137 - regression_loss: 0.9021 - classification_loss: 0.1116 257/500 [==============>...............] - ETA: 1:22 - loss: 1.0148 - regression_loss: 0.9030 - classification_loss: 0.1119 258/500 [==============>...............] - ETA: 1:22 - loss: 1.0155 - regression_loss: 0.9035 - classification_loss: 0.1120 259/500 [==============>...............] - ETA: 1:21 - loss: 1.0149 - regression_loss: 0.9028 - classification_loss: 0.1120 260/500 [==============>...............] 
- ETA: 1:21 - loss: 1.0174 - regression_loss: 0.9050 - classification_loss: 0.1125 261/500 [==============>...............] - ETA: 1:21 - loss: 1.0183 - regression_loss: 0.9056 - classification_loss: 0.1127 262/500 [==============>...............] - ETA: 1:20 - loss: 1.0183 - regression_loss: 0.9056 - classification_loss: 0.1127 263/500 [==============>...............] - ETA: 1:20 - loss: 1.0188 - regression_loss: 0.9061 - classification_loss: 0.1127 264/500 [==============>...............] - ETA: 1:20 - loss: 1.0192 - regression_loss: 0.9063 - classification_loss: 0.1128 265/500 [==============>...............] - ETA: 1:19 - loss: 1.0174 - regression_loss: 0.9048 - classification_loss: 0.1126 266/500 [==============>...............] - ETA: 1:19 - loss: 1.0182 - regression_loss: 0.9056 - classification_loss: 0.1126 267/500 [===============>..............] - ETA: 1:19 - loss: 1.0160 - regression_loss: 0.9036 - classification_loss: 0.1124 268/500 [===============>..............] - ETA: 1:18 - loss: 1.0147 - regression_loss: 0.9026 - classification_loss: 0.1121 269/500 [===============>..............] - ETA: 1:18 - loss: 1.0147 - regression_loss: 0.9026 - classification_loss: 0.1121 270/500 [===============>..............] - ETA: 1:18 - loss: 1.0150 - regression_loss: 0.9028 - classification_loss: 0.1122 271/500 [===============>..............] - ETA: 1:17 - loss: 1.0135 - regression_loss: 0.9016 - classification_loss: 0.1119 272/500 [===============>..............] - ETA: 1:17 - loss: 1.0111 - regression_loss: 0.8995 - classification_loss: 0.1116 273/500 [===============>..............] - ETA: 1:17 - loss: 1.0125 - regression_loss: 0.9006 - classification_loss: 0.1119 274/500 [===============>..............] - ETA: 1:16 - loss: 1.0120 - regression_loss: 0.9003 - classification_loss: 0.1117 275/500 [===============>..............] - ETA: 1:16 - loss: 1.0135 - regression_loss: 0.9016 - classification_loss: 0.1119 276/500 [===============>..............] 
- ETA: 1:16 - loss: 1.0146 - regression_loss: 0.9026 - classification_loss: 0.1121 277/500 [===============>..............] - ETA: 1:15 - loss: 1.0147 - regression_loss: 0.9026 - classification_loss: 0.1121 278/500 [===============>..............] - ETA: 1:15 - loss: 1.0123 - regression_loss: 0.9005 - classification_loss: 0.1118 279/500 [===============>..............] - ETA: 1:15 - loss: 1.0107 - regression_loss: 0.8992 - classification_loss: 0.1115 280/500 [===============>..............] - ETA: 1:14 - loss: 1.0111 - regression_loss: 0.8995 - classification_loss: 0.1116 281/500 [===============>..............] - ETA: 1:14 - loss: 1.0106 - regression_loss: 0.8990 - classification_loss: 0.1117 282/500 [===============>..............] - ETA: 1:14 - loss: 1.0114 - regression_loss: 0.8996 - classification_loss: 0.1118 283/500 [===============>..............] - ETA: 1:13 - loss: 1.0112 - regression_loss: 0.8994 - classification_loss: 0.1118 284/500 [================>.............] - ETA: 1:13 - loss: 1.0108 - regression_loss: 0.8993 - classification_loss: 0.1116 285/500 [================>.............] - ETA: 1:13 - loss: 1.0082 - regression_loss: 0.8969 - classification_loss: 0.1113 286/500 [================>.............] - ETA: 1:12 - loss: 1.0059 - regression_loss: 0.8948 - classification_loss: 0.1110 287/500 [================>.............] - ETA: 1:12 - loss: 1.0068 - regression_loss: 0.8957 - classification_loss: 0.1111 288/500 [================>.............] - ETA: 1:12 - loss: 1.0079 - regression_loss: 0.8966 - classification_loss: 0.1113 289/500 [================>.............] - ETA: 1:11 - loss: 1.0076 - regression_loss: 0.8963 - classification_loss: 0.1113 290/500 [================>.............] - ETA: 1:11 - loss: 1.0073 - regression_loss: 0.8959 - classification_loss: 0.1113 291/500 [================>.............] - ETA: 1:11 - loss: 1.0071 - regression_loss: 0.8958 - classification_loss: 0.1113 292/500 [================>.............] 
- ETA: 1:10 - loss: 1.0063 - regression_loss: 0.8950 - classification_loss: 0.1113 293/500 [================>.............] - ETA: 1:10 - loss: 1.0077 - regression_loss: 0.8961 - classification_loss: 0.1117 294/500 [================>.............] - ETA: 1:10 - loss: 1.0081 - regression_loss: 0.8963 - classification_loss: 0.1118 295/500 [================>.............] - ETA: 1:09 - loss: 1.0077 - regression_loss: 0.8960 - classification_loss: 0.1116 296/500 [================>.............] - ETA: 1:09 - loss: 1.0092 - regression_loss: 0.8973 - classification_loss: 0.1119 297/500 [================>.............] - ETA: 1:09 - loss: 1.0082 - regression_loss: 0.8965 - classification_loss: 0.1118 298/500 [================>.............] - ETA: 1:08 - loss: 1.0066 - regression_loss: 0.8951 - classification_loss: 0.1115 299/500 [================>.............] - ETA: 1:08 - loss: 1.0070 - regression_loss: 0.8955 - classification_loss: 0.1115 300/500 [=================>............] - ETA: 1:08 - loss: 1.0050 - regression_loss: 0.8937 - classification_loss: 0.1113 301/500 [=================>............] - ETA: 1:07 - loss: 1.0030 - regression_loss: 0.8920 - classification_loss: 0.1111 302/500 [=================>............] - ETA: 1:07 - loss: 1.0029 - regression_loss: 0.8918 - classification_loss: 0.1111 303/500 [=================>............] - ETA: 1:07 - loss: 1.0036 - regression_loss: 0.8925 - classification_loss: 0.1111 304/500 [=================>............] - ETA: 1:06 - loss: 1.0034 - regression_loss: 0.8923 - classification_loss: 0.1111 305/500 [=================>............] - ETA: 1:06 - loss: 1.0042 - regression_loss: 0.8930 - classification_loss: 0.1112 306/500 [=================>............] - ETA: 1:05 - loss: 1.0062 - regression_loss: 0.8944 - classification_loss: 0.1118 307/500 [=================>............] - ETA: 1:05 - loss: 1.0065 - regression_loss: 0.8947 - classification_loss: 0.1118 308/500 [=================>............] 
- ETA: 1:05 - loss: 1.0043 - regression_loss: 0.8926 - classification_loss: 0.1117 309/500 [=================>............] - ETA: 1:04 - loss: 1.0057 - regression_loss: 0.8939 - classification_loss: 0.1118 310/500 [=================>............] - ETA: 1:04 - loss: 1.0058 - regression_loss: 0.8940 - classification_loss: 0.1118 311/500 [=================>............] - ETA: 1:04 - loss: 1.0066 - regression_loss: 0.8947 - classification_loss: 0.1119 312/500 [=================>............] - ETA: 1:03 - loss: 1.0064 - regression_loss: 0.8944 - classification_loss: 0.1120 313/500 [=================>............] - ETA: 1:03 - loss: 1.0065 - regression_loss: 0.8946 - classification_loss: 0.1119 314/500 [=================>............] - ETA: 1:03 - loss: 1.0071 - regression_loss: 0.8951 - classification_loss: 0.1120 315/500 [=================>............] - ETA: 1:02 - loss: 1.0068 - regression_loss: 0.8948 - classification_loss: 0.1119 316/500 [=================>............] - ETA: 1:02 - loss: 1.0075 - regression_loss: 0.8955 - classification_loss: 0.1120 317/500 [==================>...........] - ETA: 1:02 - loss: 1.0085 - regression_loss: 0.8963 - classification_loss: 0.1121 318/500 [==================>...........] - ETA: 1:01 - loss: 1.0089 - regression_loss: 0.8967 - classification_loss: 0.1122 319/500 [==================>...........] - ETA: 1:01 - loss: 1.0110 - regression_loss: 0.8987 - classification_loss: 0.1123 320/500 [==================>...........] - ETA: 1:01 - loss: 1.0102 - regression_loss: 0.8980 - classification_loss: 0.1122 321/500 [==================>...........] - ETA: 1:00 - loss: 1.0103 - regression_loss: 0.8981 - classification_loss: 0.1121 322/500 [==================>...........] - ETA: 1:00 - loss: 1.0115 - regression_loss: 0.8991 - classification_loss: 0.1123 323/500 [==================>...........] - ETA: 1:00 - loss: 1.0089 - regression_loss: 0.8969 - classification_loss: 0.1121 324/500 [==================>...........] 
- ETA: 59s - loss: 1.0087 - regression_loss: 0.8969 - classification_loss: 0.1118  325/500 [==================>...........] - ETA: 59s - loss: 1.0088 - regression_loss: 0.8972 - classification_loss: 0.1117 326/500 [==================>...........] - ETA: 59s - loss: 1.0099 - regression_loss: 0.8982 - classification_loss: 0.1117 327/500 [==================>...........] - ETA: 58s - loss: 1.0092 - regression_loss: 0.8977 - classification_loss: 0.1115 328/500 [==================>...........] - ETA: 58s - loss: 1.0097 - regression_loss: 0.8981 - classification_loss: 0.1116 329/500 [==================>...........] - ETA: 58s - loss: 1.0082 - regression_loss: 0.8969 - classification_loss: 0.1113 330/500 [==================>...........] - ETA: 57s - loss: 1.0089 - regression_loss: 0.8976 - classification_loss: 0.1113 331/500 [==================>...........] - ETA: 57s - loss: 1.0088 - regression_loss: 0.8977 - classification_loss: 0.1111 332/500 [==================>...........] - ETA: 57s - loss: 1.0093 - regression_loss: 0.8982 - classification_loss: 0.1111 333/500 [==================>...........] - ETA: 56s - loss: 1.0081 - regression_loss: 0.8970 - classification_loss: 0.1111 334/500 [===================>..........] - ETA: 56s - loss: 1.0082 - regression_loss: 0.8971 - classification_loss: 0.1111 335/500 [===================>..........] - ETA: 56s - loss: 1.0075 - regression_loss: 0.8965 - classification_loss: 0.1110 336/500 [===================>..........] - ETA: 55s - loss: 1.0075 - regression_loss: 0.8965 - classification_loss: 0.1110 337/500 [===================>..........] - ETA: 55s - loss: 1.0074 - regression_loss: 0.8965 - classification_loss: 0.1108 338/500 [===================>..........] - ETA: 55s - loss: 1.0055 - regression_loss: 0.8949 - classification_loss: 0.1106 339/500 [===================>..........] - ETA: 54s - loss: 1.0051 - regression_loss: 0.8947 - classification_loss: 0.1104 340/500 [===================>..........] 
- ETA: 54s - loss: 1.0049 - regression_loss: 0.8947 - classification_loss: 0.1102 341/500 [===================>..........] - ETA: 54s - loss: 1.0063 - regression_loss: 0.8959 - classification_loss: 0.1104 342/500 [===================>..........] - ETA: 53s - loss: 1.0072 - regression_loss: 0.8966 - classification_loss: 0.1106 343/500 [===================>..........] - ETA: 53s - loss: 1.0067 - regression_loss: 0.8961 - classification_loss: 0.1106 344/500 [===================>..........] - ETA: 53s - loss: 1.0085 - regression_loss: 0.8974 - classification_loss: 0.1111 345/500 [===================>..........] - ETA: 52s - loss: 1.0107 - regression_loss: 0.8996 - classification_loss: 0.1111 346/500 [===================>..........] - ETA: 52s - loss: 1.0114 - regression_loss: 0.9001 - classification_loss: 0.1114 347/500 [===================>..........] - ETA: 52s - loss: 1.0110 - regression_loss: 0.8998 - classification_loss: 0.1113 348/500 [===================>..........] - ETA: 51s - loss: 1.0115 - regression_loss: 0.9003 - classification_loss: 0.1112 349/500 [===================>..........] - ETA: 51s - loss: 1.0108 - regression_loss: 0.8996 - classification_loss: 0.1112 350/500 [====================>.........] - ETA: 51s - loss: 1.0110 - regression_loss: 0.8996 - classification_loss: 0.1114 351/500 [====================>.........] - ETA: 50s - loss: 1.0101 - regression_loss: 0.8989 - classification_loss: 0.1112 352/500 [====================>.........] - ETA: 50s - loss: 1.0113 - regression_loss: 0.8999 - classification_loss: 0.1114 353/500 [====================>.........] - ETA: 50s - loss: 1.0101 - regression_loss: 0.8988 - classification_loss: 0.1112 354/500 [====================>.........] - ETA: 49s - loss: 1.0108 - regression_loss: 0.8994 - classification_loss: 0.1114 355/500 [====================>.........] - ETA: 49s - loss: 1.0120 - regression_loss: 0.9005 - classification_loss: 0.1115 356/500 [====================>.........] 
[... per-step progress output for epoch 30 (steps 357-499 of 500) elided ...]
500/500 [==============================] - 170s 340ms/step - loss: 1.0064 - regression_loss: 0.8940 - classification_loss: 0.1124
1172 instances of class plum with average precision: 0.7846
mAP: 0.7846
Epoch 00030: saving model to ./training/snapshots/resnet101_pascal_30.h5
Epoch 31/150
[... per-step progress output for epoch 31 elided (log truncated at step 190/500; running loss ~0.94, regression_loss ~0.84, classification_loss ~0.10) ...]
- ETA: 1:45 - loss: 0.9428 - regression_loss: 0.8415 - classification_loss: 0.1013 191/500 [==========>...................] - ETA: 1:45 - loss: 0.9417 - regression_loss: 0.8407 - classification_loss: 0.1010 192/500 [==========>...................] - ETA: 1:44 - loss: 0.9435 - regression_loss: 0.8421 - classification_loss: 0.1014 193/500 [==========>...................] - ETA: 1:44 - loss: 0.9430 - regression_loss: 0.8417 - classification_loss: 0.1013 194/500 [==========>...................] - ETA: 1:44 - loss: 0.9441 - regression_loss: 0.8428 - classification_loss: 0.1013 195/500 [==========>...................] - ETA: 1:43 - loss: 0.9451 - regression_loss: 0.8438 - classification_loss: 0.1013 196/500 [==========>...................] - ETA: 1:43 - loss: 0.9452 - regression_loss: 0.8440 - classification_loss: 0.1012 197/500 [==========>...................] - ETA: 1:42 - loss: 0.9449 - regression_loss: 0.8438 - classification_loss: 0.1011 198/500 [==========>...................] - ETA: 1:42 - loss: 0.9451 - regression_loss: 0.8441 - classification_loss: 0.1009 199/500 [==========>...................] - ETA: 1:42 - loss: 0.9464 - regression_loss: 0.8451 - classification_loss: 0.1013 200/500 [===========>..................] - ETA: 1:41 - loss: 0.9472 - regression_loss: 0.8458 - classification_loss: 0.1013 201/500 [===========>..................] - ETA: 1:41 - loss: 0.9492 - regression_loss: 0.8476 - classification_loss: 0.1017 202/500 [===========>..................] - ETA: 1:41 - loss: 0.9490 - regression_loss: 0.8475 - classification_loss: 0.1015 203/500 [===========>..................] - ETA: 1:40 - loss: 0.9498 - regression_loss: 0.8483 - classification_loss: 0.1015 204/500 [===========>..................] - ETA: 1:40 - loss: 0.9495 - regression_loss: 0.8482 - classification_loss: 0.1013 205/500 [===========>..................] - ETA: 1:40 - loss: 0.9511 - regression_loss: 0.8496 - classification_loss: 0.1015 206/500 [===========>..................] 
- ETA: 1:39 - loss: 0.9522 - regression_loss: 0.8505 - classification_loss: 0.1016 207/500 [===========>..................] - ETA: 1:39 - loss: 0.9495 - regression_loss: 0.8482 - classification_loss: 0.1013 208/500 [===========>..................] - ETA: 1:39 - loss: 0.9505 - regression_loss: 0.8491 - classification_loss: 0.1014 209/500 [===========>..................] - ETA: 1:38 - loss: 0.9506 - regression_loss: 0.8493 - classification_loss: 0.1013 210/500 [===========>..................] - ETA: 1:38 - loss: 0.9509 - regression_loss: 0.8497 - classification_loss: 0.1012 211/500 [===========>..................] - ETA: 1:38 - loss: 0.9506 - regression_loss: 0.8495 - classification_loss: 0.1012 212/500 [===========>..................] - ETA: 1:37 - loss: 0.9503 - regression_loss: 0.8494 - classification_loss: 0.1009 213/500 [===========>..................] - ETA: 1:37 - loss: 0.9545 - regression_loss: 0.8529 - classification_loss: 0.1016 214/500 [===========>..................] - ETA: 1:37 - loss: 0.9545 - regression_loss: 0.8528 - classification_loss: 0.1017 215/500 [===========>..................] - ETA: 1:36 - loss: 0.9540 - regression_loss: 0.8521 - classification_loss: 0.1019 216/500 [===========>..................] - ETA: 1:36 - loss: 0.9569 - regression_loss: 0.8545 - classification_loss: 0.1024 217/500 [============>.................] - ETA: 1:36 - loss: 0.9579 - regression_loss: 0.8553 - classification_loss: 0.1025 218/500 [============>.................] - ETA: 1:35 - loss: 0.9576 - regression_loss: 0.8553 - classification_loss: 0.1023 219/500 [============>.................] - ETA: 1:35 - loss: 0.9569 - regression_loss: 0.8548 - classification_loss: 0.1022 220/500 [============>.................] - ETA: 1:35 - loss: 0.9582 - regression_loss: 0.8559 - classification_loss: 0.1024 221/500 [============>.................] - ETA: 1:34 - loss: 0.9596 - regression_loss: 0.8570 - classification_loss: 0.1026 222/500 [============>.................] 
- ETA: 1:34 - loss: 0.9622 - regression_loss: 0.8592 - classification_loss: 0.1029 223/500 [============>.................] - ETA: 1:34 - loss: 0.9626 - regression_loss: 0.8596 - classification_loss: 0.1030 224/500 [============>.................] - ETA: 1:33 - loss: 0.9622 - regression_loss: 0.8592 - classification_loss: 0.1029 225/500 [============>.................] - ETA: 1:33 - loss: 0.9623 - regression_loss: 0.8594 - classification_loss: 0.1029 226/500 [============>.................] - ETA: 1:33 - loss: 0.9620 - regression_loss: 0.8591 - classification_loss: 0.1029 227/500 [============>.................] - ETA: 1:32 - loss: 0.9617 - regression_loss: 0.8589 - classification_loss: 0.1028 228/500 [============>.................] - ETA: 1:32 - loss: 0.9628 - regression_loss: 0.8598 - classification_loss: 0.1030 229/500 [============>.................] - ETA: 1:32 - loss: 0.9607 - regression_loss: 0.8579 - classification_loss: 0.1028 230/500 [============>.................] - ETA: 1:31 - loss: 0.9642 - regression_loss: 0.8601 - classification_loss: 0.1041 231/500 [============>.................] - ETA: 1:31 - loss: 0.9642 - regression_loss: 0.8602 - classification_loss: 0.1040 232/500 [============>.................] - ETA: 1:31 - loss: 0.9641 - regression_loss: 0.8601 - classification_loss: 0.1040 233/500 [============>.................] - ETA: 1:30 - loss: 0.9611 - regression_loss: 0.8575 - classification_loss: 0.1036 234/500 [=============>................] - ETA: 1:30 - loss: 0.9610 - regression_loss: 0.8572 - classification_loss: 0.1038 235/500 [=============>................] - ETA: 1:30 - loss: 0.9596 - regression_loss: 0.8559 - classification_loss: 0.1037 236/500 [=============>................] - ETA: 1:29 - loss: 0.9601 - regression_loss: 0.8565 - classification_loss: 0.1037 237/500 [=============>................] - ETA: 1:29 - loss: 0.9575 - regression_loss: 0.8541 - classification_loss: 0.1034 238/500 [=============>................] 
- ETA: 1:29 - loss: 0.9551 - regression_loss: 0.8520 - classification_loss: 0.1032 239/500 [=============>................] - ETA: 1:28 - loss: 0.9550 - regression_loss: 0.8520 - classification_loss: 0.1030 240/500 [=============>................] - ETA: 1:28 - loss: 0.9559 - regression_loss: 0.8529 - classification_loss: 0.1030 241/500 [=============>................] - ETA: 1:28 - loss: 0.9548 - regression_loss: 0.8517 - classification_loss: 0.1031 242/500 [=============>................] - ETA: 1:27 - loss: 0.9538 - regression_loss: 0.8508 - classification_loss: 0.1029 243/500 [=============>................] - ETA: 1:27 - loss: 0.9563 - regression_loss: 0.8529 - classification_loss: 0.1034 244/500 [=============>................] - ETA: 1:27 - loss: 0.9568 - regression_loss: 0.8534 - classification_loss: 0.1034 245/500 [=============>................] - ETA: 1:26 - loss: 0.9556 - regression_loss: 0.8523 - classification_loss: 0.1033 246/500 [=============>................] - ETA: 1:26 - loss: 0.9538 - regression_loss: 0.8508 - classification_loss: 0.1030 247/500 [=============>................] - ETA: 1:26 - loss: 0.9516 - regression_loss: 0.8489 - classification_loss: 0.1027 248/500 [=============>................] - ETA: 1:25 - loss: 0.9488 - regression_loss: 0.8464 - classification_loss: 0.1024 249/500 [=============>................] - ETA: 1:25 - loss: 0.9482 - regression_loss: 0.8458 - classification_loss: 0.1023 250/500 [==============>...............] - ETA: 1:25 - loss: 0.9462 - regression_loss: 0.8441 - classification_loss: 0.1021 251/500 [==============>...............] - ETA: 1:24 - loss: 0.9454 - regression_loss: 0.8435 - classification_loss: 0.1019 252/500 [==============>...............] - ETA: 1:24 - loss: 0.9429 - regression_loss: 0.8413 - classification_loss: 0.1016 253/500 [==============>...............] - ETA: 1:24 - loss: 0.9453 - regression_loss: 0.8434 - classification_loss: 0.1018 254/500 [==============>...............] 
- ETA: 1:23 - loss: 0.9458 - regression_loss: 0.8440 - classification_loss: 0.1018 255/500 [==============>...............] - ETA: 1:23 - loss: 0.9461 - regression_loss: 0.8444 - classification_loss: 0.1018 256/500 [==============>...............] - ETA: 1:23 - loss: 0.9447 - regression_loss: 0.8431 - classification_loss: 0.1016 257/500 [==============>...............] - ETA: 1:22 - loss: 0.9443 - regression_loss: 0.8429 - classification_loss: 0.1014 258/500 [==============>...............] - ETA: 1:22 - loss: 0.9450 - regression_loss: 0.8435 - classification_loss: 0.1015 259/500 [==============>...............] - ETA: 1:22 - loss: 0.9451 - regression_loss: 0.8436 - classification_loss: 0.1015 260/500 [==============>...............] - ETA: 1:21 - loss: 0.9452 - regression_loss: 0.8435 - classification_loss: 0.1017 261/500 [==============>...............] - ETA: 1:21 - loss: 0.9446 - regression_loss: 0.8429 - classification_loss: 0.1017 262/500 [==============>...............] - ETA: 1:20 - loss: 0.9415 - regression_loss: 0.8401 - classification_loss: 0.1014 263/500 [==============>...............] - ETA: 1:20 - loss: 0.9415 - regression_loss: 0.8402 - classification_loss: 0.1013 264/500 [==============>...............] - ETA: 1:20 - loss: 0.9422 - regression_loss: 0.8401 - classification_loss: 0.1020 265/500 [==============>...............] - ETA: 1:19 - loss: 0.9423 - regression_loss: 0.8404 - classification_loss: 0.1019 266/500 [==============>...............] - ETA: 1:19 - loss: 0.9427 - regression_loss: 0.8408 - classification_loss: 0.1019 267/500 [===============>..............] - ETA: 1:19 - loss: 0.9434 - regression_loss: 0.8415 - classification_loss: 0.1019 268/500 [===============>..............] - ETA: 1:18 - loss: 0.9420 - regression_loss: 0.8403 - classification_loss: 0.1017 269/500 [===============>..............] - ETA: 1:18 - loss: 0.9428 - regression_loss: 0.8409 - classification_loss: 0.1019 270/500 [===============>..............] 
- ETA: 1:18 - loss: 0.9421 - regression_loss: 0.8403 - classification_loss: 0.1018 271/500 [===============>..............] - ETA: 1:17 - loss: 0.9413 - regression_loss: 0.8397 - classification_loss: 0.1017 272/500 [===============>..............] - ETA: 1:17 - loss: 0.9416 - regression_loss: 0.8401 - classification_loss: 0.1015 273/500 [===============>..............] - ETA: 1:17 - loss: 0.9414 - regression_loss: 0.8399 - classification_loss: 0.1015 274/500 [===============>..............] - ETA: 1:16 - loss: 0.9428 - regression_loss: 0.8410 - classification_loss: 0.1017 275/500 [===============>..............] - ETA: 1:16 - loss: 0.9445 - regression_loss: 0.8424 - classification_loss: 0.1021 276/500 [===============>..............] - ETA: 1:16 - loss: 0.9441 - regression_loss: 0.8423 - classification_loss: 0.1018 277/500 [===============>..............] - ETA: 1:15 - loss: 0.9423 - regression_loss: 0.8407 - classification_loss: 0.1016 278/500 [===============>..............] - ETA: 1:15 - loss: 0.9446 - regression_loss: 0.8428 - classification_loss: 0.1018 279/500 [===============>..............] - ETA: 1:15 - loss: 0.9446 - regression_loss: 0.8427 - classification_loss: 0.1019 280/500 [===============>..............] - ETA: 1:14 - loss: 0.9452 - regression_loss: 0.8434 - classification_loss: 0.1018 281/500 [===============>..............] - ETA: 1:14 - loss: 0.9431 - regression_loss: 0.8416 - classification_loss: 0.1015 282/500 [===============>..............] - ETA: 1:14 - loss: 0.9416 - regression_loss: 0.8403 - classification_loss: 0.1013 283/500 [===============>..............] - ETA: 1:13 - loss: 0.9429 - regression_loss: 0.8415 - classification_loss: 0.1015 284/500 [================>.............] - ETA: 1:13 - loss: 0.9429 - regression_loss: 0.8414 - classification_loss: 0.1015 285/500 [================>.............] - ETA: 1:13 - loss: 0.9435 - regression_loss: 0.8420 - classification_loss: 0.1014 286/500 [================>.............] 
- ETA: 1:12 - loss: 0.9421 - regression_loss: 0.8409 - classification_loss: 0.1012 287/500 [================>.............] - ETA: 1:12 - loss: 0.9428 - regression_loss: 0.8416 - classification_loss: 0.1012 288/500 [================>.............] - ETA: 1:12 - loss: 0.9437 - regression_loss: 0.8424 - classification_loss: 0.1013 289/500 [================>.............] - ETA: 1:11 - loss: 0.9423 - regression_loss: 0.8412 - classification_loss: 0.1011 290/500 [================>.............] - ETA: 1:11 - loss: 0.9429 - regression_loss: 0.8418 - classification_loss: 0.1012 291/500 [================>.............] - ETA: 1:11 - loss: 0.9431 - regression_loss: 0.8416 - classification_loss: 0.1015 292/500 [================>.............] - ETA: 1:10 - loss: 0.9442 - regression_loss: 0.8427 - classification_loss: 0.1014 293/500 [================>.............] - ETA: 1:10 - loss: 0.9445 - regression_loss: 0.8431 - classification_loss: 0.1014 294/500 [================>.............] - ETA: 1:10 - loss: 0.9440 - regression_loss: 0.8427 - classification_loss: 0.1013 295/500 [================>.............] - ETA: 1:09 - loss: 0.9446 - regression_loss: 0.8433 - classification_loss: 0.1013 296/500 [================>.............] - ETA: 1:09 - loss: 0.9441 - regression_loss: 0.8429 - classification_loss: 0.1012 297/500 [================>.............] - ETA: 1:09 - loss: 0.9456 - regression_loss: 0.8441 - classification_loss: 0.1015 298/500 [================>.............] - ETA: 1:08 - loss: 0.9451 - regression_loss: 0.8437 - classification_loss: 0.1014 299/500 [================>.............] - ETA: 1:08 - loss: 0.9461 - regression_loss: 0.8445 - classification_loss: 0.1016 300/500 [=================>............] - ETA: 1:08 - loss: 0.9469 - regression_loss: 0.8453 - classification_loss: 0.1016 301/500 [=================>............] - ETA: 1:07 - loss: 0.9478 - regression_loss: 0.8460 - classification_loss: 0.1017 302/500 [=================>............] 
- ETA: 1:07 - loss: 0.9475 - regression_loss: 0.8458 - classification_loss: 0.1016 303/500 [=================>............] - ETA: 1:07 - loss: 0.9465 - regression_loss: 0.8450 - classification_loss: 0.1014 304/500 [=================>............] - ETA: 1:06 - loss: 0.9451 - regression_loss: 0.8439 - classification_loss: 0.1012 305/500 [=================>............] - ETA: 1:06 - loss: 0.9478 - regression_loss: 0.8463 - classification_loss: 0.1015 306/500 [=================>............] - ETA: 1:06 - loss: 0.9483 - regression_loss: 0.8467 - classification_loss: 0.1015 307/500 [=================>............] - ETA: 1:05 - loss: 0.9487 - regression_loss: 0.8471 - classification_loss: 0.1016 308/500 [=================>............] - ETA: 1:05 - loss: 0.9492 - regression_loss: 0.8474 - classification_loss: 0.1018 309/500 [=================>............] - ETA: 1:05 - loss: 0.9492 - regression_loss: 0.8473 - classification_loss: 0.1019 310/500 [=================>............] - ETA: 1:04 - loss: 0.9495 - regression_loss: 0.8475 - classification_loss: 0.1019 311/500 [=================>............] - ETA: 1:04 - loss: 0.9501 - regression_loss: 0.8479 - classification_loss: 0.1022 312/500 [=================>............] - ETA: 1:03 - loss: 0.9487 - regression_loss: 0.8468 - classification_loss: 0.1020 313/500 [=================>............] - ETA: 1:03 - loss: 0.9481 - regression_loss: 0.8463 - classification_loss: 0.1018 314/500 [=================>............] - ETA: 1:03 - loss: 0.9494 - regression_loss: 0.8475 - classification_loss: 0.1020 315/500 [=================>............] - ETA: 1:02 - loss: 0.9504 - regression_loss: 0.8483 - classification_loss: 0.1021 316/500 [=================>............] - ETA: 1:02 - loss: 0.9514 - regression_loss: 0.8491 - classification_loss: 0.1022 317/500 [==================>...........] - ETA: 1:02 - loss: 0.9498 - regression_loss: 0.8474 - classification_loss: 0.1023 318/500 [==================>...........] 
- ETA: 1:01 - loss: 0.9481 - regression_loss: 0.8460 - classification_loss: 0.1021 319/500 [==================>...........] - ETA: 1:01 - loss: 0.9486 - regression_loss: 0.8464 - classification_loss: 0.1022 320/500 [==================>...........] - ETA: 1:01 - loss: 0.9479 - regression_loss: 0.8458 - classification_loss: 0.1021 321/500 [==================>...........] - ETA: 1:00 - loss: 0.9497 - regression_loss: 0.8473 - classification_loss: 0.1024 322/500 [==================>...........] - ETA: 1:00 - loss: 0.9504 - regression_loss: 0.8479 - classification_loss: 0.1025 323/500 [==================>...........] - ETA: 1:00 - loss: 0.9502 - regression_loss: 0.8477 - classification_loss: 0.1025 324/500 [==================>...........] - ETA: 59s - loss: 0.9519 - regression_loss: 0.8491 - classification_loss: 0.1028  325/500 [==================>...........] - ETA: 59s - loss: 0.9503 - regression_loss: 0.8477 - classification_loss: 0.1026 326/500 [==================>...........] - ETA: 59s - loss: 0.9497 - regression_loss: 0.8472 - classification_loss: 0.1026 327/500 [==================>...........] - ETA: 58s - loss: 0.9499 - regression_loss: 0.8472 - classification_loss: 0.1027 328/500 [==================>...........] - ETA: 58s - loss: 0.9506 - regression_loss: 0.8478 - classification_loss: 0.1028 329/500 [==================>...........] - ETA: 58s - loss: 0.9512 - regression_loss: 0.8483 - classification_loss: 0.1030 330/500 [==================>...........] - ETA: 57s - loss: 0.9503 - regression_loss: 0.8476 - classification_loss: 0.1027 331/500 [==================>...........] - ETA: 57s - loss: 0.9504 - regression_loss: 0.8479 - classification_loss: 0.1025 332/500 [==================>...........] - ETA: 57s - loss: 0.9489 - regression_loss: 0.8465 - classification_loss: 0.1024 333/500 [==================>...........] - ETA: 56s - loss: 0.9496 - regression_loss: 0.8471 - classification_loss: 0.1025 334/500 [===================>..........] 
- ETA: 56s - loss: 0.9503 - regression_loss: 0.8476 - classification_loss: 0.1027 335/500 [===================>..........] - ETA: 56s - loss: 0.9498 - regression_loss: 0.8473 - classification_loss: 0.1026 336/500 [===================>..........] - ETA: 55s - loss: 0.9504 - regression_loss: 0.8478 - classification_loss: 0.1026 337/500 [===================>..........] - ETA: 55s - loss: 0.9515 - regression_loss: 0.8488 - classification_loss: 0.1027 338/500 [===================>..........] - ETA: 55s - loss: 0.9519 - regression_loss: 0.8493 - classification_loss: 0.1026 339/500 [===================>..........] - ETA: 54s - loss: 0.9524 - regression_loss: 0.8496 - classification_loss: 0.1028 340/500 [===================>..........] - ETA: 54s - loss: 0.9515 - regression_loss: 0.8488 - classification_loss: 0.1028 341/500 [===================>..........] - ETA: 54s - loss: 0.9518 - regression_loss: 0.8491 - classification_loss: 0.1027 342/500 [===================>..........] - ETA: 53s - loss: 0.9523 - regression_loss: 0.8495 - classification_loss: 0.1028 343/500 [===================>..........] - ETA: 53s - loss: 0.9533 - regression_loss: 0.8504 - classification_loss: 0.1029 344/500 [===================>..........] - ETA: 53s - loss: 0.9536 - regression_loss: 0.8507 - classification_loss: 0.1029 345/500 [===================>..........] - ETA: 52s - loss: 0.9540 - regression_loss: 0.8511 - classification_loss: 0.1029 346/500 [===================>..........] - ETA: 52s - loss: 0.9542 - regression_loss: 0.8514 - classification_loss: 0.1028 347/500 [===================>..........] - ETA: 52s - loss: 0.9541 - regression_loss: 0.8514 - classification_loss: 0.1027 348/500 [===================>..........] - ETA: 51s - loss: 0.9525 - regression_loss: 0.8499 - classification_loss: 0.1026 349/500 [===================>..........] - ETA: 51s - loss: 0.9532 - regression_loss: 0.8505 - classification_loss: 0.1027 350/500 [====================>.........] 
- ETA: 51s - loss: 0.9523 - regression_loss: 0.8496 - classification_loss: 0.1027 351/500 [====================>.........] - ETA: 50s - loss: 0.9523 - regression_loss: 0.8496 - classification_loss: 0.1027 352/500 [====================>.........] - ETA: 50s - loss: 0.9527 - regression_loss: 0.8500 - classification_loss: 0.1027 353/500 [====================>.........] - ETA: 50s - loss: 0.9530 - regression_loss: 0.8502 - classification_loss: 0.1029 354/500 [====================>.........] - ETA: 49s - loss: 0.9541 - regression_loss: 0.8511 - classification_loss: 0.1030 355/500 [====================>.........] - ETA: 49s - loss: 0.9531 - regression_loss: 0.8502 - classification_loss: 0.1029 356/500 [====================>.........] - ETA: 49s - loss: 0.9530 - regression_loss: 0.8501 - classification_loss: 0.1029 357/500 [====================>.........] - ETA: 48s - loss: 0.9533 - regression_loss: 0.8503 - classification_loss: 0.1030 358/500 [====================>.........] - ETA: 48s - loss: 0.9525 - regression_loss: 0.8498 - classification_loss: 0.1027 359/500 [====================>.........] - ETA: 47s - loss: 0.9521 - regression_loss: 0.8494 - classification_loss: 0.1027 360/500 [====================>.........] - ETA: 47s - loss: 0.9529 - regression_loss: 0.8501 - classification_loss: 0.1028 361/500 [====================>.........] - ETA: 47s - loss: 0.9519 - regression_loss: 0.8492 - classification_loss: 0.1027 362/500 [====================>.........] - ETA: 46s - loss: 0.9525 - regression_loss: 0.8496 - classification_loss: 0.1028 363/500 [====================>.........] - ETA: 46s - loss: 0.9515 - regression_loss: 0.8489 - classification_loss: 0.1027 364/500 [====================>.........] - ETA: 46s - loss: 0.9540 - regression_loss: 0.8512 - classification_loss: 0.1029 365/500 [====================>.........] - ETA: 45s - loss: 0.9538 - regression_loss: 0.8510 - classification_loss: 0.1028 366/500 [====================>.........] 
- ETA: 45s - loss: 0.9545 - regression_loss: 0.8518 - classification_loss: 0.1028 367/500 [=====================>........] - ETA: 45s - loss: 0.9548 - regression_loss: 0.8520 - classification_loss: 0.1028 368/500 [=====================>........] - ETA: 44s - loss: 0.9544 - regression_loss: 0.8517 - classification_loss: 0.1027 369/500 [=====================>........] - ETA: 44s - loss: 0.9527 - regression_loss: 0.8502 - classification_loss: 0.1025 370/500 [=====================>........] - ETA: 44s - loss: 0.9539 - regression_loss: 0.8512 - classification_loss: 0.1027 371/500 [=====================>........] - ETA: 43s - loss: 0.9546 - regression_loss: 0.8518 - classification_loss: 0.1028 372/500 [=====================>........] - ETA: 43s - loss: 0.9551 - regression_loss: 0.8523 - classification_loss: 0.1029 373/500 [=====================>........] - ETA: 43s - loss: 0.9559 - regression_loss: 0.8528 - classification_loss: 0.1031 374/500 [=====================>........] - ETA: 42s - loss: 0.9543 - regression_loss: 0.8514 - classification_loss: 0.1028 375/500 [=====================>........] - ETA: 42s - loss: 0.9529 - regression_loss: 0.8503 - classification_loss: 0.1026 376/500 [=====================>........] - ETA: 42s - loss: 0.9539 - regression_loss: 0.8510 - classification_loss: 0.1028 377/500 [=====================>........] - ETA: 41s - loss: 0.9548 - regression_loss: 0.8518 - classification_loss: 0.1030 378/500 [=====================>........] - ETA: 41s - loss: 0.9548 - regression_loss: 0.8519 - classification_loss: 0.1029 379/500 [=====================>........] - ETA: 41s - loss: 0.9558 - regression_loss: 0.8528 - classification_loss: 0.1029 380/500 [=====================>........] - ETA: 40s - loss: 0.9557 - regression_loss: 0.8527 - classification_loss: 0.1029 381/500 [=====================>........] - ETA: 40s - loss: 0.9556 - regression_loss: 0.8527 - classification_loss: 0.1029 382/500 [=====================>........] 
- ETA: 40s - loss: 0.9557 - regression_loss: 0.8527 - classification_loss: 0.1030 383/500 [=====================>........] - ETA: 39s - loss: 0.9545 - regression_loss: 0.8516 - classification_loss: 0.1029 384/500 [======================>.......] - ETA: 39s - loss: 0.9530 - regression_loss: 0.8503 - classification_loss: 0.1027 385/500 [======================>.......] - ETA: 39s - loss: 0.9544 - regression_loss: 0.8515 - classification_loss: 0.1029 386/500 [======================>.......] - ETA: 38s - loss: 0.9538 - regression_loss: 0.8511 - classification_loss: 0.1027 387/500 [======================>.......] - ETA: 38s - loss: 0.9527 - regression_loss: 0.8501 - classification_loss: 0.1026 388/500 [======================>.......] - ETA: 38s - loss: 0.9524 - regression_loss: 0.8499 - classification_loss: 0.1025 389/500 [======================>.......] - ETA: 37s - loss: 0.9520 - regression_loss: 0.8496 - classification_loss: 0.1024 390/500 [======================>.......] - ETA: 37s - loss: 0.9526 - regression_loss: 0.8500 - classification_loss: 0.1026 391/500 [======================>.......] - ETA: 37s - loss: 0.9536 - regression_loss: 0.8506 - classification_loss: 0.1029 392/500 [======================>.......] - ETA: 36s - loss: 0.9531 - regression_loss: 0.8503 - classification_loss: 0.1029 393/500 [======================>.......] - ETA: 36s - loss: 0.9541 - regression_loss: 0.8510 - classification_loss: 0.1031 394/500 [======================>.......] - ETA: 36s - loss: 0.9537 - regression_loss: 0.8507 - classification_loss: 0.1030 395/500 [======================>.......] - ETA: 35s - loss: 0.9547 - regression_loss: 0.8515 - classification_loss: 0.1032 396/500 [======================>.......] - ETA: 35s - loss: 0.9552 - regression_loss: 0.8520 - classification_loss: 0.1033 397/500 [======================>.......] - ETA: 35s - loss: 0.9558 - regression_loss: 0.8526 - classification_loss: 0.1033 398/500 [======================>.......] 
[Epoch 31/150 — per-step progress-bar updates for steps 398–499 condensed; the running loss drifted between ~0.943 and ~0.951 over these steps]
500/500 [==============================] - 170s 340ms/step - loss: 0.9482 - regression_loss: 0.8456 - classification_loss: 0.1026
1172 instances of class plum with average precision: 0.8047
mAP: 0.8047
Epoch 00031: saving model to ./training/snapshots/resnet101_pascal_31.h5
Epoch 32/150
[Per-step progress-bar updates for steps 1–231 condensed; the running loss started at 0.5760 on step 1, peaked near 0.98 around step 21, and settled near 0.97 by step 232]
232/500 [============>.................] - ETA: 1:30 - loss: 0.9681 - regression_loss: 0.8656 - classification_loss: 0.1025
- ETA: 1:30 - loss: 0.9668 - regression_loss: 0.8645 - classification_loss: 0.1023 234/500 [=============>................] - ETA: 1:29 - loss: 0.9664 - regression_loss: 0.8641 - classification_loss: 0.1023 235/500 [=============>................] - ETA: 1:29 - loss: 0.9646 - regression_loss: 0.8625 - classification_loss: 0.1021 236/500 [=============>................] - ETA: 1:29 - loss: 0.9617 - regression_loss: 0.8600 - classification_loss: 0.1017 237/500 [=============>................] - ETA: 1:28 - loss: 0.9598 - regression_loss: 0.8584 - classification_loss: 0.1014 238/500 [=============>................] - ETA: 1:28 - loss: 0.9626 - regression_loss: 0.8607 - classification_loss: 0.1019 239/500 [=============>................] - ETA: 1:28 - loss: 0.9603 - regression_loss: 0.8587 - classification_loss: 0.1015 240/500 [=============>................] - ETA: 1:27 - loss: 0.9601 - regression_loss: 0.8586 - classification_loss: 0.1015 241/500 [=============>................] - ETA: 1:27 - loss: 0.9599 - regression_loss: 0.8585 - classification_loss: 0.1014 242/500 [=============>................] - ETA: 1:27 - loss: 0.9620 - regression_loss: 0.8604 - classification_loss: 0.1016 243/500 [=============>................] - ETA: 1:26 - loss: 0.9639 - regression_loss: 0.8621 - classification_loss: 0.1019 244/500 [=============>................] - ETA: 1:26 - loss: 0.9649 - regression_loss: 0.8629 - classification_loss: 0.1020 245/500 [=============>................] - ETA: 1:26 - loss: 0.9636 - regression_loss: 0.8618 - classification_loss: 0.1018 246/500 [=============>................] - ETA: 1:25 - loss: 0.9618 - regression_loss: 0.8603 - classification_loss: 0.1016 247/500 [=============>................] - ETA: 1:25 - loss: 0.9610 - regression_loss: 0.8597 - classification_loss: 0.1013 248/500 [=============>................] - ETA: 1:25 - loss: 0.9596 - regression_loss: 0.8585 - classification_loss: 0.1011 249/500 [=============>................] 
- ETA: 1:24 - loss: 0.9606 - regression_loss: 0.8592 - classification_loss: 0.1014 250/500 [==============>...............] - ETA: 1:24 - loss: 0.9600 - regression_loss: 0.8587 - classification_loss: 0.1013 251/500 [==============>...............] - ETA: 1:24 - loss: 0.9609 - regression_loss: 0.8594 - classification_loss: 0.1015 252/500 [==============>...............] - ETA: 1:23 - loss: 0.9624 - regression_loss: 0.8601 - classification_loss: 0.1023 253/500 [==============>...............] - ETA: 1:23 - loss: 0.9630 - regression_loss: 0.8605 - classification_loss: 0.1025 254/500 [==============>...............] - ETA: 1:23 - loss: 0.9618 - regression_loss: 0.8593 - classification_loss: 0.1025 255/500 [==============>...............] - ETA: 1:22 - loss: 0.9628 - regression_loss: 0.8602 - classification_loss: 0.1026 256/500 [==============>...............] - ETA: 1:22 - loss: 0.9621 - regression_loss: 0.8597 - classification_loss: 0.1024 257/500 [==============>...............] - ETA: 1:22 - loss: 0.9621 - regression_loss: 0.8598 - classification_loss: 0.1023 258/500 [==============>...............] - ETA: 1:21 - loss: 0.9635 - regression_loss: 0.8610 - classification_loss: 0.1025 259/500 [==============>...............] - ETA: 1:21 - loss: 0.9641 - regression_loss: 0.8615 - classification_loss: 0.1026 260/500 [==============>...............] - ETA: 1:21 - loss: 0.9619 - regression_loss: 0.8595 - classification_loss: 0.1024 261/500 [==============>...............] - ETA: 1:20 - loss: 0.9592 - regression_loss: 0.8571 - classification_loss: 0.1021 262/500 [==============>...............] - ETA: 1:20 - loss: 0.9588 - regression_loss: 0.8567 - classification_loss: 0.1021 263/500 [==============>...............] - ETA: 1:20 - loss: 0.9597 - regression_loss: 0.8574 - classification_loss: 0.1023 264/500 [==============>...............] - ETA: 1:19 - loss: 0.9576 - regression_loss: 0.8556 - classification_loss: 0.1020 265/500 [==============>...............] 
- ETA: 1:19 - loss: 0.9577 - regression_loss: 0.8557 - classification_loss: 0.1020 266/500 [==============>...............] - ETA: 1:19 - loss: 0.9560 - regression_loss: 0.8543 - classification_loss: 0.1017 267/500 [===============>..............] - ETA: 1:18 - loss: 0.9569 - regression_loss: 0.8549 - classification_loss: 0.1019 268/500 [===============>..............] - ETA: 1:18 - loss: 0.9562 - regression_loss: 0.8543 - classification_loss: 0.1019 269/500 [===============>..............] - ETA: 1:18 - loss: 0.9576 - regression_loss: 0.8556 - classification_loss: 0.1020 270/500 [===============>..............] - ETA: 1:17 - loss: 0.9571 - regression_loss: 0.8551 - classification_loss: 0.1019 271/500 [===============>..............] - ETA: 1:17 - loss: 0.9575 - regression_loss: 0.8556 - classification_loss: 0.1019 272/500 [===============>..............] - ETA: 1:17 - loss: 0.9584 - regression_loss: 0.8563 - classification_loss: 0.1021 273/500 [===============>..............] - ETA: 1:16 - loss: 0.9590 - regression_loss: 0.8569 - classification_loss: 0.1021 274/500 [===============>..............] - ETA: 1:16 - loss: 0.9592 - regression_loss: 0.8571 - classification_loss: 0.1021 275/500 [===============>..............] - ETA: 1:16 - loss: 0.9588 - regression_loss: 0.8568 - classification_loss: 0.1020 276/500 [===============>..............] - ETA: 1:15 - loss: 0.9592 - regression_loss: 0.8572 - classification_loss: 0.1020 277/500 [===============>..............] - ETA: 1:15 - loss: 0.9613 - regression_loss: 0.8590 - classification_loss: 0.1023 278/500 [===============>..............] - ETA: 1:15 - loss: 0.9614 - regression_loss: 0.8590 - classification_loss: 0.1024 279/500 [===============>..............] - ETA: 1:14 - loss: 0.9611 - regression_loss: 0.8588 - classification_loss: 0.1023 280/500 [===============>..............] - ETA: 1:14 - loss: 0.9592 - regression_loss: 0.8572 - classification_loss: 0.1020 281/500 [===============>..............] 
- ETA: 1:14 - loss: 0.9596 - regression_loss: 0.8575 - classification_loss: 0.1021 282/500 [===============>..............] - ETA: 1:13 - loss: 0.9604 - regression_loss: 0.8581 - classification_loss: 0.1023 283/500 [===============>..............] - ETA: 1:13 - loss: 0.9587 - regression_loss: 0.8567 - classification_loss: 0.1020 284/500 [================>.............] - ETA: 1:13 - loss: 0.9567 - regression_loss: 0.8550 - classification_loss: 0.1018 285/500 [================>.............] - ETA: 1:12 - loss: 0.9548 - regression_loss: 0.8533 - classification_loss: 0.1015 286/500 [================>.............] - ETA: 1:12 - loss: 0.9551 - regression_loss: 0.8536 - classification_loss: 0.1015 287/500 [================>.............] - ETA: 1:12 - loss: 0.9548 - regression_loss: 0.8534 - classification_loss: 0.1014 288/500 [================>.............] - ETA: 1:11 - loss: 0.9551 - regression_loss: 0.8537 - classification_loss: 0.1014 289/500 [================>.............] - ETA: 1:11 - loss: 0.9561 - regression_loss: 0.8545 - classification_loss: 0.1016 290/500 [================>.............] - ETA: 1:11 - loss: 0.9558 - regression_loss: 0.8542 - classification_loss: 0.1016 291/500 [================>.............] - ETA: 1:10 - loss: 0.9543 - regression_loss: 0.8529 - classification_loss: 0.1013 292/500 [================>.............] - ETA: 1:10 - loss: 0.9548 - regression_loss: 0.8534 - classification_loss: 0.1014 293/500 [================>.............] - ETA: 1:10 - loss: 0.9561 - regression_loss: 0.8545 - classification_loss: 0.1016 294/500 [================>.............] - ETA: 1:09 - loss: 0.9552 - regression_loss: 0.8538 - classification_loss: 0.1014 295/500 [================>.............] - ETA: 1:09 - loss: 0.9548 - regression_loss: 0.8530 - classification_loss: 0.1018 296/500 [================>.............] - ETA: 1:09 - loss: 0.9538 - regression_loss: 0.8522 - classification_loss: 0.1016 297/500 [================>.............] 
- ETA: 1:08 - loss: 0.9529 - regression_loss: 0.8515 - classification_loss: 0.1014 298/500 [================>.............] - ETA: 1:08 - loss: 0.9533 - regression_loss: 0.8519 - classification_loss: 0.1014 299/500 [================>.............] - ETA: 1:07 - loss: 0.9554 - regression_loss: 0.8538 - classification_loss: 0.1016 300/500 [=================>............] - ETA: 1:07 - loss: 0.9557 - regression_loss: 0.8541 - classification_loss: 0.1017 301/500 [=================>............] - ETA: 1:07 - loss: 0.9568 - regression_loss: 0.8550 - classification_loss: 0.1017 302/500 [=================>............] - ETA: 1:06 - loss: 0.9567 - regression_loss: 0.8551 - classification_loss: 0.1016 303/500 [=================>............] - ETA: 1:06 - loss: 0.9572 - regression_loss: 0.8556 - classification_loss: 0.1017 304/500 [=================>............] - ETA: 1:06 - loss: 0.9576 - regression_loss: 0.8559 - classification_loss: 0.1017 305/500 [=================>............] - ETA: 1:05 - loss: 0.9560 - regression_loss: 0.8546 - classification_loss: 0.1015 306/500 [=================>............] - ETA: 1:05 - loss: 0.9540 - regression_loss: 0.8528 - classification_loss: 0.1012 307/500 [=================>............] - ETA: 1:05 - loss: 0.9531 - regression_loss: 0.8519 - classification_loss: 0.1011 308/500 [=================>............] - ETA: 1:04 - loss: 0.9531 - regression_loss: 0.8520 - classification_loss: 0.1010 309/500 [=================>............] - ETA: 1:04 - loss: 0.9511 - regression_loss: 0.8503 - classification_loss: 0.1008 310/500 [=================>............] - ETA: 1:04 - loss: 0.9535 - regression_loss: 0.8521 - classification_loss: 0.1014 311/500 [=================>............] - ETA: 1:03 - loss: 0.9537 - regression_loss: 0.8523 - classification_loss: 0.1013 312/500 [=================>............] - ETA: 1:03 - loss: 0.9536 - regression_loss: 0.8522 - classification_loss: 0.1014 313/500 [=================>............] 
- ETA: 1:03 - loss: 0.9540 - regression_loss: 0.8525 - classification_loss: 0.1015 314/500 [=================>............] - ETA: 1:02 - loss: 0.9531 - regression_loss: 0.8518 - classification_loss: 0.1013 315/500 [=================>............] - ETA: 1:02 - loss: 0.9514 - regression_loss: 0.8504 - classification_loss: 0.1010 316/500 [=================>............] - ETA: 1:02 - loss: 0.9513 - regression_loss: 0.8504 - classification_loss: 0.1009 317/500 [==================>...........] - ETA: 1:01 - loss: 0.9503 - regression_loss: 0.8496 - classification_loss: 0.1007 318/500 [==================>...........] - ETA: 1:01 - loss: 0.9515 - regression_loss: 0.8506 - classification_loss: 0.1009 319/500 [==================>...........] - ETA: 1:01 - loss: 0.9516 - regression_loss: 0.8508 - classification_loss: 0.1009 320/500 [==================>...........] - ETA: 1:00 - loss: 0.9507 - regression_loss: 0.8501 - classification_loss: 0.1006 321/500 [==================>...........] - ETA: 1:00 - loss: 0.9505 - regression_loss: 0.8497 - classification_loss: 0.1008 322/500 [==================>...........] - ETA: 1:00 - loss: 0.9499 - regression_loss: 0.8492 - classification_loss: 0.1007 323/500 [==================>...........] - ETA: 59s - loss: 0.9502 - regression_loss: 0.8494 - classification_loss: 0.1007  324/500 [==================>...........] - ETA: 59s - loss: 0.9497 - regression_loss: 0.8489 - classification_loss: 0.1007 325/500 [==================>...........] - ETA: 59s - loss: 0.9516 - regression_loss: 0.8506 - classification_loss: 0.1010 326/500 [==================>...........] - ETA: 58s - loss: 0.9516 - regression_loss: 0.8507 - classification_loss: 0.1010 327/500 [==================>...........] - ETA: 58s - loss: 0.9538 - regression_loss: 0.8524 - classification_loss: 0.1014 328/500 [==================>...........] - ETA: 58s - loss: 0.9531 - regression_loss: 0.8519 - classification_loss: 0.1012 329/500 [==================>...........] 
- ETA: 57s - loss: 0.9532 - regression_loss: 0.8520 - classification_loss: 0.1011 330/500 [==================>...........] - ETA: 57s - loss: 0.9538 - regression_loss: 0.8526 - classification_loss: 0.1012 331/500 [==================>...........] - ETA: 57s - loss: 0.9525 - regression_loss: 0.8515 - classification_loss: 0.1010 332/500 [==================>...........] - ETA: 56s - loss: 0.9524 - regression_loss: 0.8514 - classification_loss: 0.1010 333/500 [==================>...........] - ETA: 56s - loss: 0.9525 - regression_loss: 0.8513 - classification_loss: 0.1011 334/500 [===================>..........] - ETA: 56s - loss: 0.9522 - regression_loss: 0.8512 - classification_loss: 0.1010 335/500 [===================>..........] - ETA: 55s - loss: 0.9522 - regression_loss: 0.8511 - classification_loss: 0.1011 336/500 [===================>..........] - ETA: 55s - loss: 0.9519 - regression_loss: 0.8509 - classification_loss: 0.1011 337/500 [===================>..........] - ETA: 55s - loss: 0.9505 - regression_loss: 0.8497 - classification_loss: 0.1008 338/500 [===================>..........] - ETA: 54s - loss: 0.9501 - regression_loss: 0.8493 - classification_loss: 0.1008 339/500 [===================>..........] - ETA: 54s - loss: 0.9499 - regression_loss: 0.8492 - classification_loss: 0.1007 340/500 [===================>..........] - ETA: 54s - loss: 0.9500 - regression_loss: 0.8493 - classification_loss: 0.1007 341/500 [===================>..........] - ETA: 53s - loss: 0.9483 - regression_loss: 0.8478 - classification_loss: 0.1005 342/500 [===================>..........] - ETA: 53s - loss: 0.9487 - regression_loss: 0.8483 - classification_loss: 0.1005 343/500 [===================>..........] - ETA: 53s - loss: 0.9495 - regression_loss: 0.8490 - classification_loss: 0.1006 344/500 [===================>..........] - ETA: 52s - loss: 0.9479 - regression_loss: 0.8475 - classification_loss: 0.1004 345/500 [===================>..........] 
- ETA: 52s - loss: 0.9478 - regression_loss: 0.8474 - classification_loss: 0.1004 346/500 [===================>..........] - ETA: 52s - loss: 0.9476 - regression_loss: 0.8472 - classification_loss: 0.1003 347/500 [===================>..........] - ETA: 51s - loss: 0.9467 - regression_loss: 0.8465 - classification_loss: 0.1002 348/500 [===================>..........] - ETA: 51s - loss: 0.9461 - regression_loss: 0.8461 - classification_loss: 0.1000 349/500 [===================>..........] - ETA: 51s - loss: 0.9450 - regression_loss: 0.8452 - classification_loss: 0.0998 350/500 [====================>.........] - ETA: 50s - loss: 0.9439 - regression_loss: 0.8444 - classification_loss: 0.0996 351/500 [====================>.........] - ETA: 50s - loss: 0.9439 - regression_loss: 0.8443 - classification_loss: 0.0996 352/500 [====================>.........] - ETA: 50s - loss: 0.9427 - regression_loss: 0.8433 - classification_loss: 0.0994 353/500 [====================>.........] - ETA: 49s - loss: 0.9426 - regression_loss: 0.8431 - classification_loss: 0.0995 354/500 [====================>.........] - ETA: 49s - loss: 0.9422 - regression_loss: 0.8428 - classification_loss: 0.0994 355/500 [====================>.........] - ETA: 49s - loss: 0.9425 - regression_loss: 0.8431 - classification_loss: 0.0994 356/500 [====================>.........] - ETA: 48s - loss: 0.9438 - regression_loss: 0.8444 - classification_loss: 0.0994 357/500 [====================>.........] - ETA: 48s - loss: 0.9435 - regression_loss: 0.8443 - classification_loss: 0.0993 358/500 [====================>.........] - ETA: 48s - loss: 0.9433 - regression_loss: 0.8438 - classification_loss: 0.0995 359/500 [====================>.........] - ETA: 47s - loss: 0.9443 - regression_loss: 0.8447 - classification_loss: 0.0996 360/500 [====================>.........] - ETA: 47s - loss: 0.9454 - regression_loss: 0.8458 - classification_loss: 0.0996 361/500 [====================>.........] 
- ETA: 47s - loss: 0.9465 - regression_loss: 0.8470 - classification_loss: 0.0995 362/500 [====================>.........] - ETA: 46s - loss: 0.9466 - regression_loss: 0.8471 - classification_loss: 0.0995 363/500 [====================>.........] - ETA: 46s - loss: 0.9472 - regression_loss: 0.8476 - classification_loss: 0.0996 364/500 [====================>.........] - ETA: 46s - loss: 0.9473 - regression_loss: 0.8477 - classification_loss: 0.0996 365/500 [====================>.........] - ETA: 45s - loss: 0.9481 - regression_loss: 0.8484 - classification_loss: 0.0997 366/500 [====================>.........] - ETA: 45s - loss: 0.9484 - regression_loss: 0.8487 - classification_loss: 0.0997 367/500 [=====================>........] - ETA: 45s - loss: 0.9494 - regression_loss: 0.8495 - classification_loss: 0.0999 368/500 [=====================>........] - ETA: 44s - loss: 0.9501 - regression_loss: 0.8501 - classification_loss: 0.0999 369/500 [=====================>........] - ETA: 44s - loss: 0.9497 - regression_loss: 0.8499 - classification_loss: 0.0998 370/500 [=====================>........] - ETA: 43s - loss: 0.9506 - regression_loss: 0.8507 - classification_loss: 0.0999 371/500 [=====================>........] - ETA: 43s - loss: 0.9503 - regression_loss: 0.8505 - classification_loss: 0.0997 372/500 [=====================>........] - ETA: 43s - loss: 0.9488 - regression_loss: 0.8493 - classification_loss: 0.0995 373/500 [=====================>........] - ETA: 42s - loss: 0.9492 - regression_loss: 0.8497 - classification_loss: 0.0996 374/500 [=====================>........] - ETA: 42s - loss: 0.9504 - regression_loss: 0.8507 - classification_loss: 0.0997 375/500 [=====================>........] - ETA: 42s - loss: 0.9490 - regression_loss: 0.8495 - classification_loss: 0.0996 376/500 [=====================>........] - ETA: 41s - loss: 0.9479 - regression_loss: 0.8485 - classification_loss: 0.0994 377/500 [=====================>........] 
- ETA: 41s - loss: 0.9492 - regression_loss: 0.8497 - classification_loss: 0.0995 378/500 [=====================>........] - ETA: 41s - loss: 0.9501 - regression_loss: 0.8504 - classification_loss: 0.0997 379/500 [=====================>........] - ETA: 40s - loss: 0.9509 - regression_loss: 0.8512 - classification_loss: 0.0997 380/500 [=====================>........] - ETA: 40s - loss: 0.9508 - regression_loss: 0.8512 - classification_loss: 0.0997 381/500 [=====================>........] - ETA: 40s - loss: 0.9503 - regression_loss: 0.8507 - classification_loss: 0.0996 382/500 [=====================>........] - ETA: 39s - loss: 0.9501 - regression_loss: 0.8506 - classification_loss: 0.0995 383/500 [=====================>........] - ETA: 39s - loss: 0.9514 - regression_loss: 0.8518 - classification_loss: 0.0996 384/500 [======================>.......] - ETA: 39s - loss: 0.9526 - regression_loss: 0.8526 - classification_loss: 0.1000 385/500 [======================>.......] - ETA: 38s - loss: 0.9546 - regression_loss: 0.8544 - classification_loss: 0.1002 386/500 [======================>.......] - ETA: 38s - loss: 0.9553 - regression_loss: 0.8550 - classification_loss: 0.1003 387/500 [======================>.......] - ETA: 38s - loss: 0.9549 - regression_loss: 0.8547 - classification_loss: 0.1001 388/500 [======================>.......] - ETA: 37s - loss: 0.9547 - regression_loss: 0.8547 - classification_loss: 0.1000 389/500 [======================>.......] - ETA: 37s - loss: 0.9540 - regression_loss: 0.8540 - classification_loss: 0.1000 390/500 [======================>.......] - ETA: 37s - loss: 0.9532 - regression_loss: 0.8534 - classification_loss: 0.0999 391/500 [======================>.......] - ETA: 36s - loss: 0.9536 - regression_loss: 0.8536 - classification_loss: 0.1000 392/500 [======================>.......] - ETA: 36s - loss: 0.9522 - regression_loss: 0.8525 - classification_loss: 0.0997 393/500 [======================>.......] 
- ETA: 36s - loss: 0.9521 - regression_loss: 0.8524 - classification_loss: 0.0997 394/500 [======================>.......] - ETA: 35s - loss: 0.9521 - regression_loss: 0.8524 - classification_loss: 0.0997 395/500 [======================>.......] - ETA: 35s - loss: 0.9526 - regression_loss: 0.8528 - classification_loss: 0.0998 396/500 [======================>.......] - ETA: 35s - loss: 0.9530 - regression_loss: 0.8531 - classification_loss: 0.0999 397/500 [======================>.......] - ETA: 34s - loss: 0.9514 - regression_loss: 0.8517 - classification_loss: 0.0997 398/500 [======================>.......] - ETA: 34s - loss: 0.9517 - regression_loss: 0.8519 - classification_loss: 0.0998 399/500 [======================>.......] - ETA: 34s - loss: 0.9512 - regression_loss: 0.8515 - classification_loss: 0.0997 400/500 [=======================>......] - ETA: 33s - loss: 0.9514 - regression_loss: 0.8516 - classification_loss: 0.0997 401/500 [=======================>......] - ETA: 33s - loss: 0.9513 - regression_loss: 0.8516 - classification_loss: 0.0997 402/500 [=======================>......] - ETA: 33s - loss: 0.9498 - regression_loss: 0.8503 - classification_loss: 0.0995 403/500 [=======================>......] - ETA: 32s - loss: 0.9488 - regression_loss: 0.8494 - classification_loss: 0.0994 404/500 [=======================>......] - ETA: 32s - loss: 0.9500 - regression_loss: 0.8504 - classification_loss: 0.0996 405/500 [=======================>......] - ETA: 32s - loss: 0.9495 - regression_loss: 0.8500 - classification_loss: 0.0995 406/500 [=======================>......] - ETA: 31s - loss: 0.9512 - regression_loss: 0.8517 - classification_loss: 0.0995 407/500 [=======================>......] - ETA: 31s - loss: 0.9522 - regression_loss: 0.8525 - classification_loss: 0.0996 408/500 [=======================>......] - ETA: 31s - loss: 0.9511 - regression_loss: 0.8516 - classification_loss: 0.0995 409/500 [=======================>......] 
- ETA: 30s - loss: 0.9515 - regression_loss: 0.8520 - classification_loss: 0.0995 410/500 [=======================>......] - ETA: 30s - loss: 0.9508 - regression_loss: 0.8514 - classification_loss: 0.0994 411/500 [=======================>......] - ETA: 30s - loss: 0.9497 - regression_loss: 0.8505 - classification_loss: 0.0992 412/500 [=======================>......] - ETA: 29s - loss: 0.9487 - regression_loss: 0.8496 - classification_loss: 0.0991 413/500 [=======================>......] - ETA: 29s - loss: 0.9470 - regression_loss: 0.8481 - classification_loss: 0.0989 414/500 [=======================>......] - ETA: 29s - loss: 0.9461 - regression_loss: 0.8474 - classification_loss: 0.0987 415/500 [=======================>......] - ETA: 28s - loss: 0.9476 - regression_loss: 0.8487 - classification_loss: 0.0990 416/500 [=======================>......] - ETA: 28s - loss: 0.9487 - regression_loss: 0.8496 - classification_loss: 0.0991 417/500 [========================>.....] - ETA: 28s - loss: 0.9482 - regression_loss: 0.8492 - classification_loss: 0.0991 418/500 [========================>.....] - ETA: 27s - loss: 0.9480 - regression_loss: 0.8489 - classification_loss: 0.0991 419/500 [========================>.....] - ETA: 27s - loss: 0.9486 - regression_loss: 0.8493 - classification_loss: 0.0993 420/500 [========================>.....] - ETA: 27s - loss: 0.9485 - regression_loss: 0.8493 - classification_loss: 0.0992 421/500 [========================>.....] - ETA: 26s - loss: 0.9485 - regression_loss: 0.8493 - classification_loss: 0.0991 422/500 [========================>.....] - ETA: 26s - loss: 0.9486 - regression_loss: 0.8495 - classification_loss: 0.0991 423/500 [========================>.....] - ETA: 26s - loss: 0.9493 - regression_loss: 0.8503 - classification_loss: 0.0991 424/500 [========================>.....] - ETA: 25s - loss: 0.9497 - regression_loss: 0.8507 - classification_loss: 0.0990 425/500 [========================>.....] 
- ETA: 25s - loss: 0.9502 - regression_loss: 0.8512 - classification_loss: 0.0990 426/500 [========================>.....] - ETA: 25s - loss: 0.9504 - regression_loss: 0.8512 - classification_loss: 0.0992 427/500 [========================>.....] - ETA: 24s - loss: 0.9525 - regression_loss: 0.8529 - classification_loss: 0.0996 428/500 [========================>.....] - ETA: 24s - loss: 0.9536 - regression_loss: 0.8538 - classification_loss: 0.0999 429/500 [========================>.....] - ETA: 24s - loss: 0.9537 - regression_loss: 0.8539 - classification_loss: 0.0998 430/500 [========================>.....] - ETA: 23s - loss: 0.9559 - regression_loss: 0.8555 - classification_loss: 0.1003 431/500 [========================>.....] - ETA: 23s - loss: 0.9553 - regression_loss: 0.8551 - classification_loss: 0.1002 432/500 [========================>.....] - ETA: 23s - loss: 0.9568 - regression_loss: 0.8563 - classification_loss: 0.1005 433/500 [========================>.....] - ETA: 22s - loss: 0.9572 - regression_loss: 0.8566 - classification_loss: 0.1006 434/500 [=========================>....] - ETA: 22s - loss: 0.9576 - regression_loss: 0.8570 - classification_loss: 0.1006 435/500 [=========================>....] - ETA: 22s - loss: 0.9575 - regression_loss: 0.8569 - classification_loss: 0.1006 436/500 [=========================>....] - ETA: 21s - loss: 0.9577 - regression_loss: 0.8571 - classification_loss: 0.1006 437/500 [=========================>....] - ETA: 21s - loss: 0.9598 - regression_loss: 0.8589 - classification_loss: 0.1009 438/500 [=========================>....] - ETA: 21s - loss: 0.9604 - regression_loss: 0.8595 - classification_loss: 0.1009 439/500 [=========================>....] - ETA: 20s - loss: 0.9608 - regression_loss: 0.8598 - classification_loss: 0.1010 440/500 [=========================>....] - ETA: 20s - loss: 0.9612 - regression_loss: 0.8601 - classification_loss: 0.1011 441/500 [=========================>....] 
[per-batch progress updates for batches 442–499 of epoch 32 omitted; total loss hovered around 0.96, regression loss around 0.86, classification loss around 0.10]
500/500 [==============================] - 170s 339ms/step - loss: 0.9639 - regression_loss: 0.8625 - classification_loss: 0.1014
1172 instances of class plum with average precision: 0.7736
mAP: 0.7736
Epoch 00032: saving model to ./training/snapshots/resnet101_pascal_32.h5
Epoch 33/150
[per-batch progress updates for batches 1–275 of epoch 33 omitted; total loss fluctuated between roughly 0.86 and 0.99, sitting near 0.94 by batch 275, with ETA counting down from about 2:47 to 1:16]
276/500 [===============>..............]
- ETA: 1:16 - loss: 0.9422 - regression_loss: 0.8407 - classification_loss: 0.1014 277/500 [===============>..............] - ETA: 1:15 - loss: 0.9428 - regression_loss: 0.8411 - classification_loss: 0.1016 278/500 [===============>..............] - ETA: 1:15 - loss: 0.9434 - regression_loss: 0.8417 - classification_loss: 0.1018 279/500 [===============>..............] - ETA: 1:15 - loss: 0.9438 - regression_loss: 0.8421 - classification_loss: 0.1017 280/500 [===============>..............] - ETA: 1:14 - loss: 0.9435 - regression_loss: 0.8418 - classification_loss: 0.1017 281/500 [===============>..............] - ETA: 1:14 - loss: 0.9447 - regression_loss: 0.8428 - classification_loss: 0.1019 282/500 [===============>..............] - ETA: 1:14 - loss: 0.9455 - regression_loss: 0.8435 - classification_loss: 0.1020 283/500 [===============>..............] - ETA: 1:13 - loss: 0.9456 - regression_loss: 0.8435 - classification_loss: 0.1022 284/500 [================>.............] - ETA: 1:13 - loss: 0.9461 - regression_loss: 0.8439 - classification_loss: 0.1022 285/500 [================>.............] - ETA: 1:13 - loss: 0.9460 - regression_loss: 0.8439 - classification_loss: 0.1021 286/500 [================>.............] - ETA: 1:12 - loss: 0.9464 - regression_loss: 0.8441 - classification_loss: 0.1022 287/500 [================>.............] - ETA: 1:12 - loss: 0.9452 - regression_loss: 0.8432 - classification_loss: 0.1020 288/500 [================>.............] - ETA: 1:12 - loss: 0.9461 - regression_loss: 0.8439 - classification_loss: 0.1022 289/500 [================>.............] - ETA: 1:11 - loss: 0.9446 - regression_loss: 0.8427 - classification_loss: 0.1020 290/500 [================>.............] - ETA: 1:11 - loss: 0.9453 - regression_loss: 0.8433 - classification_loss: 0.1020 291/500 [================>.............] - ETA: 1:11 - loss: 0.9470 - regression_loss: 0.8448 - classification_loss: 0.1023 292/500 [================>.............] 
- ETA: 1:10 - loss: 0.9481 - regression_loss: 0.8456 - classification_loss: 0.1025 293/500 [================>.............] - ETA: 1:10 - loss: 0.9464 - regression_loss: 0.8442 - classification_loss: 0.1023 294/500 [================>.............] - ETA: 1:10 - loss: 0.9447 - regression_loss: 0.8427 - classification_loss: 0.1020 295/500 [================>.............] - ETA: 1:09 - loss: 0.9461 - regression_loss: 0.8441 - classification_loss: 0.1020 296/500 [================>.............] - ETA: 1:09 - loss: 0.9466 - regression_loss: 0.8449 - classification_loss: 0.1018 297/500 [================>.............] - ETA: 1:09 - loss: 0.9461 - regression_loss: 0.8445 - classification_loss: 0.1016 298/500 [================>.............] - ETA: 1:08 - loss: 0.9450 - regression_loss: 0.8435 - classification_loss: 0.1015 299/500 [================>.............] - ETA: 1:08 - loss: 0.9463 - regression_loss: 0.8448 - classification_loss: 0.1015 300/500 [=================>............] - ETA: 1:07 - loss: 0.9461 - regression_loss: 0.8448 - classification_loss: 0.1013 301/500 [=================>............] - ETA: 1:07 - loss: 0.9465 - regression_loss: 0.8452 - classification_loss: 0.1013 302/500 [=================>............] - ETA: 1:07 - loss: 0.9457 - regression_loss: 0.8445 - classification_loss: 0.1012 303/500 [=================>............] - ETA: 1:06 - loss: 0.9456 - regression_loss: 0.8445 - classification_loss: 0.1012 304/500 [=================>............] - ETA: 1:06 - loss: 0.9455 - regression_loss: 0.8444 - classification_loss: 0.1011 305/500 [=================>............] - ETA: 1:06 - loss: 0.9461 - regression_loss: 0.8449 - classification_loss: 0.1013 306/500 [=================>............] - ETA: 1:05 - loss: 0.9476 - regression_loss: 0.8463 - classification_loss: 0.1014 307/500 [=================>............] - ETA: 1:05 - loss: 0.9471 - regression_loss: 0.8460 - classification_loss: 0.1012 308/500 [=================>............] 
- ETA: 1:05 - loss: 0.9485 - regression_loss: 0.8470 - classification_loss: 0.1015 309/500 [=================>............] - ETA: 1:04 - loss: 0.9479 - regression_loss: 0.8466 - classification_loss: 0.1013 310/500 [=================>............] - ETA: 1:04 - loss: 0.9480 - regression_loss: 0.8467 - classification_loss: 0.1013 311/500 [=================>............] - ETA: 1:04 - loss: 0.9471 - regression_loss: 0.8460 - classification_loss: 0.1011 312/500 [=================>............] - ETA: 1:03 - loss: 0.9479 - regression_loss: 0.8469 - classification_loss: 0.1010 313/500 [=================>............] - ETA: 1:03 - loss: 0.9499 - regression_loss: 0.8485 - classification_loss: 0.1014 314/500 [=================>............] - ETA: 1:03 - loss: 0.9486 - regression_loss: 0.8474 - classification_loss: 0.1011 315/500 [=================>............] - ETA: 1:02 - loss: 0.9503 - regression_loss: 0.8492 - classification_loss: 0.1012 316/500 [=================>............] - ETA: 1:02 - loss: 0.9483 - regression_loss: 0.8473 - classification_loss: 0.1009 317/500 [==================>...........] - ETA: 1:02 - loss: 0.9490 - regression_loss: 0.8480 - classification_loss: 0.1010 318/500 [==================>...........] - ETA: 1:01 - loss: 0.9482 - regression_loss: 0.8472 - classification_loss: 0.1010 319/500 [==================>...........] - ETA: 1:01 - loss: 0.9482 - regression_loss: 0.8470 - classification_loss: 0.1012 320/500 [==================>...........] - ETA: 1:01 - loss: 0.9468 - regression_loss: 0.8458 - classification_loss: 0.1010 321/500 [==================>...........] - ETA: 1:00 - loss: 0.9448 - regression_loss: 0.8440 - classification_loss: 0.1008 322/500 [==================>...........] - ETA: 1:00 - loss: 0.9447 - regression_loss: 0.8441 - classification_loss: 0.1006 323/500 [==================>...........] - ETA: 1:00 - loss: 0.9451 - regression_loss: 0.8447 - classification_loss: 0.1004 324/500 [==================>...........] 
- ETA: 59s - loss: 0.9452 - regression_loss: 0.8448 - classification_loss: 0.1003  325/500 [==================>...........] - ETA: 59s - loss: 0.9443 - regression_loss: 0.8441 - classification_loss: 0.1002 326/500 [==================>...........] - ETA: 59s - loss: 0.9454 - regression_loss: 0.8451 - classification_loss: 0.1003 327/500 [==================>...........] - ETA: 58s - loss: 0.9452 - regression_loss: 0.8449 - classification_loss: 0.1003 328/500 [==================>...........] - ETA: 58s - loss: 0.9447 - regression_loss: 0.8445 - classification_loss: 0.1002 329/500 [==================>...........] - ETA: 58s - loss: 0.9458 - regression_loss: 0.8454 - classification_loss: 0.1004 330/500 [==================>...........] - ETA: 57s - loss: 0.9456 - regression_loss: 0.8453 - classification_loss: 0.1003 331/500 [==================>...........] - ETA: 57s - loss: 0.9447 - regression_loss: 0.8446 - classification_loss: 0.1002 332/500 [==================>...........] - ETA: 57s - loss: 0.9445 - regression_loss: 0.8443 - classification_loss: 0.1002 333/500 [==================>...........] - ETA: 56s - loss: 0.9435 - regression_loss: 0.8435 - classification_loss: 0.1000 334/500 [===================>..........] - ETA: 56s - loss: 0.9431 - regression_loss: 0.8431 - classification_loss: 0.1000 335/500 [===================>..........] - ETA: 56s - loss: 0.9425 - regression_loss: 0.8424 - classification_loss: 0.1001 336/500 [===================>..........] - ETA: 55s - loss: 0.9418 - regression_loss: 0.8419 - classification_loss: 0.1000 337/500 [===================>..........] - ETA: 55s - loss: 0.9396 - regression_loss: 0.8399 - classification_loss: 0.0997 338/500 [===================>..........] - ETA: 55s - loss: 0.9395 - regression_loss: 0.8398 - classification_loss: 0.0997 339/500 [===================>..........] - ETA: 54s - loss: 0.9386 - regression_loss: 0.8389 - classification_loss: 0.0996 340/500 [===================>..........] 
- ETA: 54s - loss: 0.9400 - regression_loss: 0.8402 - classification_loss: 0.0998 341/500 [===================>..........] - ETA: 53s - loss: 0.9409 - regression_loss: 0.8410 - classification_loss: 0.0999 342/500 [===================>..........] - ETA: 53s - loss: 0.9405 - regression_loss: 0.8407 - classification_loss: 0.0998 343/500 [===================>..........] - ETA: 53s - loss: 0.9405 - regression_loss: 0.8407 - classification_loss: 0.0998 344/500 [===================>..........] - ETA: 52s - loss: 0.9404 - regression_loss: 0.8407 - classification_loss: 0.0997 345/500 [===================>..........] - ETA: 52s - loss: 0.9429 - regression_loss: 0.8424 - classification_loss: 0.1005 346/500 [===================>..........] - ETA: 52s - loss: 0.9425 - regression_loss: 0.8421 - classification_loss: 0.1004 347/500 [===================>..........] - ETA: 51s - loss: 0.9426 - regression_loss: 0.8422 - classification_loss: 0.1003 348/500 [===================>..........] - ETA: 51s - loss: 0.9413 - regression_loss: 0.8411 - classification_loss: 0.1002 349/500 [===================>..........] - ETA: 51s - loss: 0.9420 - regression_loss: 0.8419 - classification_loss: 0.1001 350/500 [====================>.........] - ETA: 50s - loss: 0.9408 - regression_loss: 0.8409 - classification_loss: 0.0999 351/500 [====================>.........] - ETA: 50s - loss: 0.9394 - regression_loss: 0.8397 - classification_loss: 0.0997 352/500 [====================>.........] - ETA: 50s - loss: 0.9402 - regression_loss: 0.8403 - classification_loss: 0.0999 353/500 [====================>.........] - ETA: 49s - loss: 0.9394 - regression_loss: 0.8396 - classification_loss: 0.0998 354/500 [====================>.........] - ETA: 49s - loss: 0.9402 - regression_loss: 0.8403 - classification_loss: 0.0999 355/500 [====================>.........] - ETA: 49s - loss: 0.9440 - regression_loss: 0.8434 - classification_loss: 0.1006 356/500 [====================>.........] 
- ETA: 48s - loss: 0.9449 - regression_loss: 0.8442 - classification_loss: 0.1007 357/500 [====================>.........] - ETA: 48s - loss: 0.9435 - regression_loss: 0.8429 - classification_loss: 0.1006 358/500 [====================>.........] - ETA: 48s - loss: 0.9420 - regression_loss: 0.8416 - classification_loss: 0.1004 359/500 [====================>.........] - ETA: 47s - loss: 0.9425 - regression_loss: 0.8423 - classification_loss: 0.1003 360/500 [====================>.........] - ETA: 47s - loss: 0.9430 - regression_loss: 0.8426 - classification_loss: 0.1004 361/500 [====================>.........] - ETA: 47s - loss: 0.9430 - regression_loss: 0.8426 - classification_loss: 0.1004 362/500 [====================>.........] - ETA: 46s - loss: 0.9425 - regression_loss: 0.8421 - classification_loss: 0.1004 363/500 [====================>.........] - ETA: 46s - loss: 0.9422 - regression_loss: 0.8418 - classification_loss: 0.1004 364/500 [====================>.........] - ETA: 46s - loss: 0.9419 - regression_loss: 0.8414 - classification_loss: 0.1004 365/500 [====================>.........] - ETA: 45s - loss: 0.9412 - regression_loss: 0.8409 - classification_loss: 0.1003 366/500 [====================>.........] - ETA: 45s - loss: 0.9418 - regression_loss: 0.8415 - classification_loss: 0.1003 367/500 [=====================>........] - ETA: 45s - loss: 0.9403 - regression_loss: 0.8401 - classification_loss: 0.1001 368/500 [=====================>........] - ETA: 44s - loss: 0.9406 - regression_loss: 0.8406 - classification_loss: 0.1001 369/500 [=====================>........] - ETA: 44s - loss: 0.9414 - regression_loss: 0.8412 - classification_loss: 0.1002 370/500 [=====================>........] - ETA: 44s - loss: 0.9413 - regression_loss: 0.8411 - classification_loss: 0.1002 371/500 [=====================>........] - ETA: 43s - loss: 0.9412 - regression_loss: 0.8411 - classification_loss: 0.1001 372/500 [=====================>........] 
- ETA: 43s - loss: 0.9410 - regression_loss: 0.8410 - classification_loss: 0.1000 373/500 [=====================>........] - ETA: 43s - loss: 0.9402 - regression_loss: 0.8402 - classification_loss: 0.0999 374/500 [=====================>........] - ETA: 42s - loss: 0.9389 - regression_loss: 0.8391 - classification_loss: 0.0998 375/500 [=====================>........] - ETA: 42s - loss: 0.9387 - regression_loss: 0.8389 - classification_loss: 0.0998 376/500 [=====================>........] - ETA: 42s - loss: 0.9385 - regression_loss: 0.8387 - classification_loss: 0.0997 377/500 [=====================>........] - ETA: 41s - loss: 0.9389 - regression_loss: 0.8391 - classification_loss: 0.0998 378/500 [=====================>........] - ETA: 41s - loss: 0.9389 - regression_loss: 0.8391 - classification_loss: 0.0998 379/500 [=====================>........] - ETA: 41s - loss: 0.9399 - regression_loss: 0.8400 - classification_loss: 0.0999 380/500 [=====================>........] - ETA: 40s - loss: 0.9382 - regression_loss: 0.8385 - classification_loss: 0.0997 381/500 [=====================>........] - ETA: 40s - loss: 0.9386 - regression_loss: 0.8388 - classification_loss: 0.0998 382/500 [=====================>........] - ETA: 40s - loss: 0.9368 - regression_loss: 0.8373 - classification_loss: 0.0996 383/500 [=====================>........] - ETA: 39s - loss: 0.9373 - regression_loss: 0.8377 - classification_loss: 0.0997 384/500 [======================>.......] - ETA: 39s - loss: 0.9358 - regression_loss: 0.8363 - classification_loss: 0.0995 385/500 [======================>.......] - ETA: 39s - loss: 0.9364 - regression_loss: 0.8369 - classification_loss: 0.0995 386/500 [======================>.......] - ETA: 38s - loss: 0.9355 - regression_loss: 0.8362 - classification_loss: 0.0993 387/500 [======================>.......] - ETA: 38s - loss: 0.9355 - regression_loss: 0.8362 - classification_loss: 0.0992 388/500 [======================>.......] 
- ETA: 38s - loss: 0.9363 - regression_loss: 0.8370 - classification_loss: 0.0993 389/500 [======================>.......] - ETA: 37s - loss: 0.9366 - regression_loss: 0.8373 - classification_loss: 0.0993 390/500 [======================>.......] - ETA: 37s - loss: 0.9359 - regression_loss: 0.8367 - classification_loss: 0.0991 391/500 [======================>.......] - ETA: 37s - loss: 0.9358 - regression_loss: 0.8367 - classification_loss: 0.0991 392/500 [======================>.......] - ETA: 36s - loss: 0.9355 - regression_loss: 0.8365 - classification_loss: 0.0990 393/500 [======================>.......] - ETA: 36s - loss: 0.9359 - regression_loss: 0.8368 - classification_loss: 0.0991 394/500 [======================>.......] - ETA: 35s - loss: 0.9364 - regression_loss: 0.8373 - classification_loss: 0.0991 395/500 [======================>.......] - ETA: 35s - loss: 0.9365 - regression_loss: 0.8374 - classification_loss: 0.0991 396/500 [======================>.......] - ETA: 35s - loss: 0.9359 - regression_loss: 0.8369 - classification_loss: 0.0990 397/500 [======================>.......] - ETA: 34s - loss: 0.9363 - regression_loss: 0.8373 - classification_loss: 0.0990 398/500 [======================>.......] - ETA: 34s - loss: 0.9371 - regression_loss: 0.8380 - classification_loss: 0.0991 399/500 [======================>.......] - ETA: 34s - loss: 0.9369 - regression_loss: 0.8379 - classification_loss: 0.0990 400/500 [=======================>......] - ETA: 33s - loss: 0.9366 - regression_loss: 0.8378 - classification_loss: 0.0988 401/500 [=======================>......] - ETA: 33s - loss: 0.9363 - regression_loss: 0.8375 - classification_loss: 0.0988 402/500 [=======================>......] - ETA: 33s - loss: 0.9368 - regression_loss: 0.8379 - classification_loss: 0.0989 403/500 [=======================>......] - ETA: 32s - loss: 0.9368 - regression_loss: 0.8379 - classification_loss: 0.0989 404/500 [=======================>......] 
- ETA: 32s - loss: 0.9378 - regression_loss: 0.8387 - classification_loss: 0.0991 405/500 [=======================>......] - ETA: 32s - loss: 0.9371 - regression_loss: 0.8381 - classification_loss: 0.0990 406/500 [=======================>......] - ETA: 31s - loss: 0.9383 - regression_loss: 0.8391 - classification_loss: 0.0991 407/500 [=======================>......] - ETA: 31s - loss: 0.9379 - regression_loss: 0.8388 - classification_loss: 0.0991 408/500 [=======================>......] - ETA: 31s - loss: 0.9380 - regression_loss: 0.8389 - classification_loss: 0.0991 409/500 [=======================>......] - ETA: 30s - loss: 0.9377 - regression_loss: 0.8386 - classification_loss: 0.0991 410/500 [=======================>......] - ETA: 30s - loss: 0.9382 - regression_loss: 0.8389 - classification_loss: 0.0993 411/500 [=======================>......] - ETA: 30s - loss: 0.9377 - regression_loss: 0.8385 - classification_loss: 0.0992 412/500 [=======================>......] - ETA: 29s - loss: 0.9363 - regression_loss: 0.8372 - classification_loss: 0.0991 413/500 [=======================>......] - ETA: 29s - loss: 0.9362 - regression_loss: 0.8371 - classification_loss: 0.0991 414/500 [=======================>......] - ETA: 29s - loss: 0.9367 - regression_loss: 0.8375 - classification_loss: 0.0992 415/500 [=======================>......] - ETA: 28s - loss: 0.9365 - regression_loss: 0.8374 - classification_loss: 0.0991 416/500 [=======================>......] - ETA: 28s - loss: 0.9376 - regression_loss: 0.8385 - classification_loss: 0.0991 417/500 [========================>.....] - ETA: 28s - loss: 0.9367 - regression_loss: 0.8377 - classification_loss: 0.0990 418/500 [========================>.....] - ETA: 27s - loss: 0.9380 - regression_loss: 0.8387 - classification_loss: 0.0993 419/500 [========================>.....] - ETA: 27s - loss: 0.9379 - regression_loss: 0.8387 - classification_loss: 0.0993 420/500 [========================>.....] 
- ETA: 27s - loss: 0.9377 - regression_loss: 0.8386 - classification_loss: 0.0992 421/500 [========================>.....] - ETA: 26s - loss: 0.9379 - regression_loss: 0.8388 - classification_loss: 0.0991 422/500 [========================>.....] - ETA: 26s - loss: 0.9371 - regression_loss: 0.8381 - classification_loss: 0.0990 423/500 [========================>.....] - ETA: 26s - loss: 0.9376 - regression_loss: 0.8385 - classification_loss: 0.0990 424/500 [========================>.....] - ETA: 25s - loss: 0.9369 - regression_loss: 0.8380 - classification_loss: 0.0989 425/500 [========================>.....] - ETA: 25s - loss: 0.9376 - regression_loss: 0.8385 - classification_loss: 0.0990 426/500 [========================>.....] - ETA: 25s - loss: 0.9380 - regression_loss: 0.8388 - classification_loss: 0.0992 427/500 [========================>.....] - ETA: 24s - loss: 0.9382 - regression_loss: 0.8391 - classification_loss: 0.0992 428/500 [========================>.....] - ETA: 24s - loss: 0.9377 - regression_loss: 0.8387 - classification_loss: 0.0990 429/500 [========================>.....] - ETA: 24s - loss: 0.9382 - regression_loss: 0.8391 - classification_loss: 0.0991 430/500 [========================>.....] - ETA: 23s - loss: 0.9399 - regression_loss: 0.8402 - classification_loss: 0.0996 431/500 [========================>.....] - ETA: 23s - loss: 0.9399 - regression_loss: 0.8402 - classification_loss: 0.0997 432/500 [========================>.....] - ETA: 23s - loss: 0.9391 - regression_loss: 0.8396 - classification_loss: 0.0995 433/500 [========================>.....] - ETA: 22s - loss: 0.9398 - regression_loss: 0.8402 - classification_loss: 0.0997 434/500 [=========================>....] - ETA: 22s - loss: 0.9396 - regression_loss: 0.8400 - classification_loss: 0.0996 435/500 [=========================>....] - ETA: 22s - loss: 0.9395 - regression_loss: 0.8399 - classification_loss: 0.0995 436/500 [=========================>....] 
- ETA: 21s - loss: 0.9392 - regression_loss: 0.8396 - classification_loss: 0.0996 437/500 [=========================>....] - ETA: 21s - loss: 0.9395 - regression_loss: 0.8399 - classification_loss: 0.0996 438/500 [=========================>....] - ETA: 21s - loss: 0.9383 - regression_loss: 0.8389 - classification_loss: 0.0994 439/500 [=========================>....] - ETA: 20s - loss: 0.9393 - regression_loss: 0.8399 - classification_loss: 0.0994 440/500 [=========================>....] - ETA: 20s - loss: 0.9408 - regression_loss: 0.8411 - classification_loss: 0.0997 441/500 [=========================>....] - ETA: 20s - loss: 0.9414 - regression_loss: 0.8416 - classification_loss: 0.0998 442/500 [=========================>....] - ETA: 19s - loss: 0.9415 - regression_loss: 0.8417 - classification_loss: 0.0998 443/500 [=========================>....] - ETA: 19s - loss: 0.9417 - regression_loss: 0.8419 - classification_loss: 0.0998 444/500 [=========================>....] - ETA: 19s - loss: 0.9418 - regression_loss: 0.8420 - classification_loss: 0.0998 445/500 [=========================>....] - ETA: 18s - loss: 0.9417 - regression_loss: 0.8418 - classification_loss: 0.0998 446/500 [=========================>....] - ETA: 18s - loss: 0.9433 - regression_loss: 0.8432 - classification_loss: 0.1000 447/500 [=========================>....] - ETA: 17s - loss: 0.9426 - regression_loss: 0.8427 - classification_loss: 0.0999 448/500 [=========================>....] - ETA: 17s - loss: 0.9424 - regression_loss: 0.8425 - classification_loss: 0.0998 449/500 [=========================>....] - ETA: 17s - loss: 0.9414 - regression_loss: 0.8417 - classification_loss: 0.0997 450/500 [==========================>...] - ETA: 16s - loss: 0.9423 - regression_loss: 0.8425 - classification_loss: 0.0997 451/500 [==========================>...] - ETA: 16s - loss: 0.9421 - regression_loss: 0.8425 - classification_loss: 0.0997 452/500 [==========================>...] 
- ETA: 16s - loss: 0.9425 - regression_loss: 0.8428 - classification_loss: 0.0998 453/500 [==========================>...] - ETA: 15s - loss: 0.9419 - regression_loss: 0.8422 - classification_loss: 0.0997 454/500 [==========================>...] - ETA: 15s - loss: 0.9410 - regression_loss: 0.8414 - classification_loss: 0.0996 455/500 [==========================>...] - ETA: 15s - loss: 0.9405 - regression_loss: 0.8411 - classification_loss: 0.0995 456/500 [==========================>...] - ETA: 14s - loss: 0.9395 - regression_loss: 0.8401 - classification_loss: 0.0993 457/500 [==========================>...] - ETA: 14s - loss: 0.9398 - regression_loss: 0.8405 - classification_loss: 0.0994 458/500 [==========================>...] - ETA: 14s - loss: 0.9395 - regression_loss: 0.8403 - classification_loss: 0.0993 459/500 [==========================>...] - ETA: 13s - loss: 0.9391 - regression_loss: 0.8399 - classification_loss: 0.0992 460/500 [==========================>...] - ETA: 13s - loss: 0.9393 - regression_loss: 0.8401 - classification_loss: 0.0992 461/500 [==========================>...] - ETA: 13s - loss: 0.9390 - regression_loss: 0.8397 - classification_loss: 0.0993 462/500 [==========================>...] - ETA: 12s - loss: 0.9389 - regression_loss: 0.8398 - classification_loss: 0.0992 463/500 [==========================>...] - ETA: 12s - loss: 0.9400 - regression_loss: 0.8406 - classification_loss: 0.0994 464/500 [==========================>...] - ETA: 12s - loss: 0.9418 - regression_loss: 0.8420 - classification_loss: 0.0997 465/500 [==========================>...] - ETA: 11s - loss: 0.9420 - regression_loss: 0.8422 - classification_loss: 0.0998 466/500 [==========================>...] - ETA: 11s - loss: 0.9432 - regression_loss: 0.8433 - classification_loss: 0.0999 467/500 [===========================>..] - ETA: 11s - loss: 0.9426 - regression_loss: 0.8427 - classification_loss: 0.1000 468/500 [===========================>..] 
- ETA: 10s - loss: 0.9417 - regression_loss: 0.8419 - classification_loss: 0.0998 469/500 [===========================>..] - ETA: 10s - loss: 0.9412 - regression_loss: 0.8416 - classification_loss: 0.0997 470/500 [===========================>..] - ETA: 10s - loss: 0.9420 - regression_loss: 0.8422 - classification_loss: 0.0998 471/500 [===========================>..] - ETA: 9s - loss: 0.9426 - regression_loss: 0.8426 - classification_loss: 0.1000  472/500 [===========================>..] - ETA: 9s - loss: 0.9423 - regression_loss: 0.8423 - classification_loss: 0.1000 473/500 [===========================>..] - ETA: 9s - loss: 0.9413 - regression_loss: 0.8414 - classification_loss: 0.0999 474/500 [===========================>..] - ETA: 8s - loss: 0.9407 - regression_loss: 0.8408 - classification_loss: 0.0999 475/500 [===========================>..] - ETA: 8s - loss: 0.9405 - regression_loss: 0.8406 - classification_loss: 0.0998 476/500 [===========================>..] - ETA: 8s - loss: 0.9408 - regression_loss: 0.8410 - classification_loss: 0.0998 477/500 [===========================>..] - ETA: 7s - loss: 0.9409 - regression_loss: 0.8411 - classification_loss: 0.0998 478/500 [===========================>..] - ETA: 7s - loss: 0.9395 - regression_loss: 0.8398 - classification_loss: 0.0996 479/500 [===========================>..] - ETA: 7s - loss: 0.9397 - regression_loss: 0.8402 - classification_loss: 0.0995 480/500 [===========================>..] - ETA: 6s - loss: 0.9394 - regression_loss: 0.8399 - classification_loss: 0.0995 481/500 [===========================>..] - ETA: 6s - loss: 0.9388 - regression_loss: 0.8394 - classification_loss: 0.0994 482/500 [===========================>..] - ETA: 6s - loss: 0.9379 - regression_loss: 0.8386 - classification_loss: 0.0993 483/500 [===========================>..] - ETA: 5s - loss: 0.9384 - regression_loss: 0.8391 - classification_loss: 0.0993 484/500 [============================>.] 
- ETA: 5s - loss: 0.9374 - regression_loss: 0.8382 - classification_loss: 0.0991 485/500 [============================>.] - ETA: 5s - loss: 0.9385 - regression_loss: 0.8392 - classification_loss: 0.0993 486/500 [============================>.] - ETA: 4s - loss: 0.9389 - regression_loss: 0.8396 - classification_loss: 0.0993 487/500 [============================>.] - ETA: 4s - loss: 0.9378 - regression_loss: 0.8387 - classification_loss: 0.0992 488/500 [============================>.] - ETA: 4s - loss: 0.9380 - regression_loss: 0.8388 - classification_loss: 0.0992 489/500 [============================>.] - ETA: 3s - loss: 0.9367 - regression_loss: 0.8377 - classification_loss: 0.0990 490/500 [============================>.] - ETA: 3s - loss: 0.9373 - regression_loss: 0.8382 - classification_loss: 0.0991 491/500 [============================>.] - ETA: 3s - loss: 0.9372 - regression_loss: 0.8380 - classification_loss: 0.0991 492/500 [============================>.] - ETA: 2s - loss: 0.9377 - regression_loss: 0.8384 - classification_loss: 0.0993 493/500 [============================>.] - ETA: 2s - loss: 0.9370 - regression_loss: 0.8378 - classification_loss: 0.0992 494/500 [============================>.] - ETA: 2s - loss: 0.9375 - regression_loss: 0.8383 - classification_loss: 0.0992 495/500 [============================>.] - ETA: 1s - loss: 0.9372 - regression_loss: 0.8381 - classification_loss: 0.0992 496/500 [============================>.] - ETA: 1s - loss: 0.9371 - regression_loss: 0.8380 - classification_loss: 0.0991 497/500 [============================>.] - ETA: 1s - loss: 0.9369 - regression_loss: 0.8378 - classification_loss: 0.0990 498/500 [============================>.] - ETA: 0s - loss: 0.9357 - regression_loss: 0.8368 - classification_loss: 0.0989 499/500 [============================>.] 
- ETA: 0s - loss: 0.9374 - regression_loss: 0.8381 - classification_loss: 0.0993
500/500 [==============================] - 170s 339ms/step - loss: 0.9378 - regression_loss: 0.8385 - classification_loss: 0.0994
1172 instances of class plum with average precision: 0.7741
mAP: 0.7741
Epoch 00033: saving model to ./training/snapshots/resnet101_pascal_33.h5
Epoch 34/150
1/500 [..............................] - ETA: 2:45 - loss: 0.8746 - regression_loss: 0.8074 - classification_loss: 0.0672
[per-step progress lines for steps 2-45 collapsed: loss varied between ~0.87 and ~1.06 over the first steps before settling near 0.95-0.97 (regression_loss ≈ 0.85-0.87, classification_loss ≈ 0.10), ETA around 2:34-2:49]
46/500 [=>............................] 
- ETA: 2:33 - loss: 0.9705 - regression_loss: 0.8703 - classification_loss: 0.1002 47/500 [=>............................] - ETA: 2:33 - loss: 0.9728 - regression_loss: 0.8719 - classification_loss: 0.1009 48/500 [=>............................] - ETA: 2:33 - loss: 0.9721 - regression_loss: 0.8723 - classification_loss: 0.0998 49/500 [=>............................] - ETA: 2:33 - loss: 0.9717 - regression_loss: 0.8712 - classification_loss: 0.1005 50/500 [==>...........................] - ETA: 2:32 - loss: 0.9737 - regression_loss: 0.8739 - classification_loss: 0.0998 51/500 [==>...........................] - ETA: 2:32 - loss: 0.9726 - regression_loss: 0.8729 - classification_loss: 0.0998 52/500 [==>...........................] - ETA: 2:31 - loss: 0.9648 - regression_loss: 0.8663 - classification_loss: 0.0986 53/500 [==>...........................] - ETA: 2:31 - loss: 0.9634 - regression_loss: 0.8648 - classification_loss: 0.0985 54/500 [==>...........................] - ETA: 2:31 - loss: 0.9632 - regression_loss: 0.8652 - classification_loss: 0.0980 55/500 [==>...........................] - ETA: 2:30 - loss: 0.9568 - regression_loss: 0.8593 - classification_loss: 0.0976 56/500 [==>...........................] - ETA: 2:30 - loss: 0.9560 - regression_loss: 0.8587 - classification_loss: 0.0973 57/500 [==>...........................] - ETA: 2:29 - loss: 0.9573 - regression_loss: 0.8594 - classification_loss: 0.0979 58/500 [==>...........................] - ETA: 2:29 - loss: 0.9601 - regression_loss: 0.8618 - classification_loss: 0.0984 59/500 [==>...........................] - ETA: 2:29 - loss: 0.9521 - regression_loss: 0.8549 - classification_loss: 0.0972 60/500 [==>...........................] - ETA: 2:28 - loss: 0.9507 - regression_loss: 0.8541 - classification_loss: 0.0966 61/500 [==>...........................] - ETA: 2:28 - loss: 0.9597 - regression_loss: 0.8632 - classification_loss: 0.0965 62/500 [==>...........................] 
- ETA: 2:28 - loss: 0.9548 - regression_loss: 0.8592 - classification_loss: 0.0956 63/500 [==>...........................] - ETA: 2:27 - loss: 0.9503 - regression_loss: 0.8556 - classification_loss: 0.0947 64/500 [==>...........................] - ETA: 2:27 - loss: 0.9489 - regression_loss: 0.8545 - classification_loss: 0.0945 65/500 [==>...........................] - ETA: 2:27 - loss: 0.9421 - regression_loss: 0.8484 - classification_loss: 0.0937 66/500 [==>...........................] - ETA: 2:26 - loss: 0.9332 - regression_loss: 0.8404 - classification_loss: 0.0927 67/500 [===>..........................] - ETA: 2:26 - loss: 0.9359 - regression_loss: 0.8422 - classification_loss: 0.0937 68/500 [===>..........................] - ETA: 2:26 - loss: 0.9375 - regression_loss: 0.8435 - classification_loss: 0.0940 69/500 [===>..........................] - ETA: 2:26 - loss: 0.9468 - regression_loss: 0.8531 - classification_loss: 0.0937 70/500 [===>..........................] - ETA: 2:25 - loss: 0.9391 - regression_loss: 0.8461 - classification_loss: 0.0929 71/500 [===>..........................] - ETA: 2:25 - loss: 0.9386 - regression_loss: 0.8459 - classification_loss: 0.0927 72/500 [===>..........................] - ETA: 2:25 - loss: 0.9350 - regression_loss: 0.8425 - classification_loss: 0.0925 73/500 [===>..........................] - ETA: 2:24 - loss: 0.9311 - regression_loss: 0.8393 - classification_loss: 0.0918 74/500 [===>..........................] - ETA: 2:24 - loss: 0.9384 - regression_loss: 0.8455 - classification_loss: 0.0929 75/500 [===>..........................] - ETA: 2:24 - loss: 0.9328 - regression_loss: 0.8408 - classification_loss: 0.0920 76/500 [===>..........................] - ETA: 2:23 - loss: 0.9245 - regression_loss: 0.8334 - classification_loss: 0.0911 77/500 [===>..........................] - ETA: 2:23 - loss: 0.9242 - regression_loss: 0.8326 - classification_loss: 0.0916 78/500 [===>..........................] 
- ETA: 2:22 - loss: 0.9270 - regression_loss: 0.8353 - classification_loss: 0.0917 79/500 [===>..........................] - ETA: 2:22 - loss: 0.9395 - regression_loss: 0.8467 - classification_loss: 0.0928 80/500 [===>..........................] - ETA: 2:22 - loss: 0.9431 - regression_loss: 0.8495 - classification_loss: 0.0936 81/500 [===>..........................] - ETA: 2:21 - loss: 0.9431 - regression_loss: 0.8494 - classification_loss: 0.0937 82/500 [===>..........................] - ETA: 2:21 - loss: 0.9403 - regression_loss: 0.8472 - classification_loss: 0.0931 83/500 [===>..........................] - ETA: 2:21 - loss: 0.9481 - regression_loss: 0.8542 - classification_loss: 0.0940 84/500 [====>.........................] - ETA: 2:21 - loss: 0.9529 - regression_loss: 0.8583 - classification_loss: 0.0946 85/500 [====>.........................] - ETA: 2:20 - loss: 0.9505 - regression_loss: 0.8566 - classification_loss: 0.0939 86/500 [====>.........................] - ETA: 2:20 - loss: 0.9472 - regression_loss: 0.8540 - classification_loss: 0.0931 87/500 [====>.........................] - ETA: 2:19 - loss: 0.9415 - regression_loss: 0.8492 - classification_loss: 0.0923 88/500 [====>.........................] - ETA: 2:19 - loss: 0.9363 - regression_loss: 0.8447 - classification_loss: 0.0915 89/500 [====>.........................] - ETA: 2:19 - loss: 0.9404 - regression_loss: 0.8487 - classification_loss: 0.0917 90/500 [====>.........................] - ETA: 2:18 - loss: 0.9462 - regression_loss: 0.8536 - classification_loss: 0.0927 91/500 [====>.........................] - ETA: 2:18 - loss: 0.9464 - regression_loss: 0.8534 - classification_loss: 0.0930 92/500 [====>.........................] - ETA: 2:18 - loss: 0.9443 - regression_loss: 0.8515 - classification_loss: 0.0928 93/500 [====>.........................] - ETA: 2:18 - loss: 0.9376 - regression_loss: 0.8455 - classification_loss: 0.0921 94/500 [====>.........................] 
- ETA: 2:17 - loss: 0.9369 - regression_loss: 0.8448 - classification_loss: 0.0921 95/500 [====>.........................] - ETA: 2:17 - loss: 0.9405 - regression_loss: 0.8474 - classification_loss: 0.0930 96/500 [====>.........................] - ETA: 2:17 - loss: 0.9392 - regression_loss: 0.8463 - classification_loss: 0.0929 97/500 [====>.........................] - ETA: 2:16 - loss: 0.9431 - regression_loss: 0.8495 - classification_loss: 0.0936 98/500 [====>.........................] - ETA: 2:16 - loss: 0.9396 - regression_loss: 0.8465 - classification_loss: 0.0931 99/500 [====>.........................] - ETA: 2:16 - loss: 0.9417 - regression_loss: 0.8483 - classification_loss: 0.0935 100/500 [=====>........................] - ETA: 2:15 - loss: 0.9389 - regression_loss: 0.8460 - classification_loss: 0.0929 101/500 [=====>........................] - ETA: 2:15 - loss: 0.9418 - regression_loss: 0.8485 - classification_loss: 0.0933 102/500 [=====>........................] - ETA: 2:15 - loss: 0.9398 - regression_loss: 0.8471 - classification_loss: 0.0927 103/500 [=====>........................] - ETA: 2:14 - loss: 0.9415 - regression_loss: 0.8484 - classification_loss: 0.0931 104/500 [=====>........................] - ETA: 2:14 - loss: 0.9436 - regression_loss: 0.8498 - classification_loss: 0.0938 105/500 [=====>........................] - ETA: 2:14 - loss: 0.9451 - regression_loss: 0.8511 - classification_loss: 0.0941 106/500 [=====>........................] - ETA: 2:13 - loss: 0.9420 - regression_loss: 0.8482 - classification_loss: 0.0938 107/500 [=====>........................] - ETA: 2:13 - loss: 0.9407 - regression_loss: 0.8474 - classification_loss: 0.0934 108/500 [=====>........................] - ETA: 2:13 - loss: 0.9444 - regression_loss: 0.8503 - classification_loss: 0.0940 109/500 [=====>........................] - ETA: 2:12 - loss: 0.9400 - regression_loss: 0.8464 - classification_loss: 0.0936 110/500 [=====>........................] 
- ETA: 2:12 - loss: 0.9414 - regression_loss: 0.8479 - classification_loss: 0.0935 111/500 [=====>........................] - ETA: 2:12 - loss: 0.9422 - regression_loss: 0.8487 - classification_loss: 0.0935 112/500 [=====>........................] - ETA: 2:11 - loss: 0.9407 - regression_loss: 0.8475 - classification_loss: 0.0932 113/500 [=====>........................] - ETA: 2:11 - loss: 0.9398 - regression_loss: 0.8466 - classification_loss: 0.0932 114/500 [=====>........................] - ETA: 2:11 - loss: 0.9385 - regression_loss: 0.8457 - classification_loss: 0.0928 115/500 [=====>........................] - ETA: 2:10 - loss: 0.9370 - regression_loss: 0.8445 - classification_loss: 0.0925 116/500 [=====>........................] - ETA: 2:10 - loss: 0.9350 - regression_loss: 0.8429 - classification_loss: 0.0922 117/500 [======>.......................] - ETA: 2:10 - loss: 0.9352 - regression_loss: 0.8430 - classification_loss: 0.0922 118/500 [======>.......................] - ETA: 2:09 - loss: 0.9333 - regression_loss: 0.8413 - classification_loss: 0.0919 119/500 [======>.......................] - ETA: 2:09 - loss: 0.9319 - regression_loss: 0.8401 - classification_loss: 0.0918 120/500 [======>.......................] - ETA: 2:09 - loss: 0.9350 - regression_loss: 0.8427 - classification_loss: 0.0923 121/500 [======>.......................] - ETA: 2:08 - loss: 0.9322 - regression_loss: 0.8402 - classification_loss: 0.0919 122/500 [======>.......................] - ETA: 2:08 - loss: 0.9347 - regression_loss: 0.8423 - classification_loss: 0.0924 123/500 [======>.......................] - ETA: 2:08 - loss: 0.9333 - regression_loss: 0.8410 - classification_loss: 0.0923 124/500 [======>.......................] - ETA: 2:07 - loss: 0.9324 - regression_loss: 0.8405 - classification_loss: 0.0918 125/500 [======>.......................] - ETA: 2:07 - loss: 0.9326 - regression_loss: 0.8407 - classification_loss: 0.0919 126/500 [======>.......................] 
- ETA: 2:06 - loss: 0.9321 - regression_loss: 0.8404 - classification_loss: 0.0917 127/500 [======>.......................] - ETA: 2:06 - loss: 0.9351 - regression_loss: 0.8420 - classification_loss: 0.0931 128/500 [======>.......................] - ETA: 2:06 - loss: 0.9320 - regression_loss: 0.8394 - classification_loss: 0.0926 129/500 [======>.......................] - ETA: 2:05 - loss: 0.9326 - regression_loss: 0.8400 - classification_loss: 0.0926 130/500 [======>.......................] - ETA: 2:05 - loss: 0.9344 - regression_loss: 0.8415 - classification_loss: 0.0930 131/500 [======>.......................] - ETA: 2:05 - loss: 0.9371 - regression_loss: 0.8435 - classification_loss: 0.0936 132/500 [======>.......................] - ETA: 2:04 - loss: 0.9378 - regression_loss: 0.8441 - classification_loss: 0.0936 133/500 [======>.......................] - ETA: 2:04 - loss: 0.9350 - regression_loss: 0.8415 - classification_loss: 0.0935 134/500 [=======>......................] - ETA: 2:04 - loss: 0.9342 - regression_loss: 0.8409 - classification_loss: 0.0933 135/500 [=======>......................] - ETA: 2:03 - loss: 0.9326 - regression_loss: 0.8396 - classification_loss: 0.0930 136/500 [=======>......................] - ETA: 2:03 - loss: 0.9469 - regression_loss: 0.8496 - classification_loss: 0.0973 137/500 [=======>......................] - ETA: 2:03 - loss: 0.9458 - regression_loss: 0.8486 - classification_loss: 0.0972 138/500 [=======>......................] - ETA: 2:02 - loss: 0.9467 - regression_loss: 0.8494 - classification_loss: 0.0973 139/500 [=======>......................] - ETA: 2:02 - loss: 0.9481 - regression_loss: 0.8510 - classification_loss: 0.0971 140/500 [=======>......................] - ETA: 2:02 - loss: 0.9469 - regression_loss: 0.8499 - classification_loss: 0.0970 141/500 [=======>......................] - ETA: 2:01 - loss: 0.9502 - regression_loss: 0.8529 - classification_loss: 0.0973 142/500 [=======>......................] 
- ETA: 2:01 - loss: 0.9507 - regression_loss: 0.8534 - classification_loss: 0.0973 143/500 [=======>......................] - ETA: 2:01 - loss: 0.9513 - regression_loss: 0.8536 - classification_loss: 0.0977 144/500 [=======>......................] - ETA: 2:00 - loss: 0.9519 - regression_loss: 0.8539 - classification_loss: 0.0980 145/500 [=======>......................] - ETA: 2:00 - loss: 0.9500 - regression_loss: 0.8521 - classification_loss: 0.0979 146/500 [=======>......................] - ETA: 2:00 - loss: 0.9462 - regression_loss: 0.8487 - classification_loss: 0.0975 147/500 [=======>......................] - ETA: 2:00 - loss: 0.9482 - regression_loss: 0.8504 - classification_loss: 0.0978 148/500 [=======>......................] - ETA: 1:59 - loss: 0.9464 - regression_loss: 0.8488 - classification_loss: 0.0976 149/500 [=======>......................] - ETA: 1:59 - loss: 0.9450 - regression_loss: 0.8476 - classification_loss: 0.0975 150/500 [========>.....................] - ETA: 1:58 - loss: 0.9402 - regression_loss: 0.8431 - classification_loss: 0.0972 151/500 [========>.....................] - ETA: 1:58 - loss: 0.9437 - regression_loss: 0.8464 - classification_loss: 0.0973 152/500 [========>.....................] - ETA: 1:58 - loss: 0.9459 - regression_loss: 0.8482 - classification_loss: 0.0976 153/500 [========>.....................] - ETA: 1:57 - loss: 0.9474 - regression_loss: 0.8498 - classification_loss: 0.0976 154/500 [========>.....................] - ETA: 1:57 - loss: 0.9473 - regression_loss: 0.8495 - classification_loss: 0.0977 155/500 [========>.....................] - ETA: 1:57 - loss: 0.9436 - regression_loss: 0.8463 - classification_loss: 0.0973 156/500 [========>.....................] - ETA: 1:56 - loss: 0.9457 - regression_loss: 0.8480 - classification_loss: 0.0977 157/500 [========>.....................] - ETA: 1:56 - loss: 0.9462 - regression_loss: 0.8484 - classification_loss: 0.0978 158/500 [========>.....................] 
- ETA: 1:56 - loss: 0.9444 - regression_loss: 0.8466 - classification_loss: 0.0978 159/500 [========>.....................] - ETA: 1:55 - loss: 0.9452 - regression_loss: 0.8473 - classification_loss: 0.0979 160/500 [========>.....................] - ETA: 1:55 - loss: 0.9447 - regression_loss: 0.8468 - classification_loss: 0.0979 161/500 [========>.....................] - ETA: 1:55 - loss: 0.9459 - regression_loss: 0.8478 - classification_loss: 0.0981 162/500 [========>.....................] - ETA: 1:54 - loss: 0.9452 - regression_loss: 0.8472 - classification_loss: 0.0980 163/500 [========>.....................] - ETA: 1:54 - loss: 0.9467 - regression_loss: 0.8486 - classification_loss: 0.0980 164/500 [========>.....................] - ETA: 1:53 - loss: 0.9485 - regression_loss: 0.8503 - classification_loss: 0.0982 165/500 [========>.....................] - ETA: 1:53 - loss: 0.9500 - regression_loss: 0.8513 - classification_loss: 0.0986 166/500 [========>.....................] - ETA: 1:53 - loss: 0.9483 - regression_loss: 0.8497 - classification_loss: 0.0986 167/500 [=========>....................] - ETA: 1:52 - loss: 0.9489 - regression_loss: 0.8500 - classification_loss: 0.0989 168/500 [=========>....................] - ETA: 1:52 - loss: 0.9482 - regression_loss: 0.8495 - classification_loss: 0.0987 169/500 [=========>....................] - ETA: 1:52 - loss: 0.9503 - regression_loss: 0.8513 - classification_loss: 0.0990 170/500 [=========>....................] - ETA: 1:51 - loss: 0.9504 - regression_loss: 0.8516 - classification_loss: 0.0987 171/500 [=========>....................] - ETA: 1:51 - loss: 0.9525 - regression_loss: 0.8536 - classification_loss: 0.0989 172/500 [=========>....................] - ETA: 1:51 - loss: 0.9521 - regression_loss: 0.8532 - classification_loss: 0.0989 173/500 [=========>....................] - ETA: 1:50 - loss: 0.9498 - regression_loss: 0.8512 - classification_loss: 0.0986 174/500 [=========>....................] 
- ETA: 1:50 - loss: 0.9522 - regression_loss: 0.8535 - classification_loss: 0.0988 175/500 [=========>....................] - ETA: 1:50 - loss: 0.9534 - regression_loss: 0.8544 - classification_loss: 0.0990 176/500 [=========>....................] - ETA: 1:49 - loss: 0.9541 - regression_loss: 0.8550 - classification_loss: 0.0991 177/500 [=========>....................] - ETA: 1:49 - loss: 0.9569 - regression_loss: 0.8572 - classification_loss: 0.0996 178/500 [=========>....................] - ETA: 1:49 - loss: 0.9562 - regression_loss: 0.8567 - classification_loss: 0.0995 179/500 [=========>....................] - ETA: 1:48 - loss: 0.9573 - regression_loss: 0.8576 - classification_loss: 0.0997 180/500 [=========>....................] - ETA: 1:48 - loss: 0.9555 - regression_loss: 0.8561 - classification_loss: 0.0994 181/500 [=========>....................] - ETA: 1:48 - loss: 0.9561 - regression_loss: 0.8565 - classification_loss: 0.0996 182/500 [=========>....................] - ETA: 1:47 - loss: 0.9532 - regression_loss: 0.8541 - classification_loss: 0.0991 183/500 [=========>....................] - ETA: 1:47 - loss: 0.9524 - regression_loss: 0.8532 - classification_loss: 0.0991 184/500 [==========>...................] - ETA: 1:47 - loss: 0.9526 - regression_loss: 0.8534 - classification_loss: 0.0992 185/500 [==========>...................] - ETA: 1:46 - loss: 0.9530 - regression_loss: 0.8539 - classification_loss: 0.0991 186/500 [==========>...................] - ETA: 1:46 - loss: 0.9492 - regression_loss: 0.8505 - classification_loss: 0.0987 187/500 [==========>...................] - ETA: 1:46 - loss: 0.9507 - regression_loss: 0.8517 - classification_loss: 0.0990 188/500 [==========>...................] - ETA: 1:45 - loss: 0.9522 - regression_loss: 0.8531 - classification_loss: 0.0991 189/500 [==========>...................] - ETA: 1:45 - loss: 0.9520 - regression_loss: 0.8532 - classification_loss: 0.0988 190/500 [==========>...................] 
- ETA: 1:45 - loss: 0.9515 - regression_loss: 0.8529 - classification_loss: 0.0986 191/500 [==========>...................] - ETA: 1:44 - loss: 0.9524 - regression_loss: 0.8536 - classification_loss: 0.0988 192/500 [==========>...................] - ETA: 1:44 - loss: 0.9551 - regression_loss: 0.8558 - classification_loss: 0.0992 193/500 [==========>...................] - ETA: 1:44 - loss: 0.9553 - regression_loss: 0.8561 - classification_loss: 0.0991 194/500 [==========>...................] - ETA: 1:43 - loss: 0.9536 - regression_loss: 0.8549 - classification_loss: 0.0988 195/500 [==========>...................] - ETA: 1:43 - loss: 0.9546 - regression_loss: 0.8555 - classification_loss: 0.0992 196/500 [==========>...................] - ETA: 1:43 - loss: 0.9549 - regression_loss: 0.8554 - classification_loss: 0.0995 197/500 [==========>...................] - ETA: 1:42 - loss: 0.9540 - regression_loss: 0.8547 - classification_loss: 0.0993 198/500 [==========>...................] - ETA: 1:42 - loss: 0.9536 - regression_loss: 0.8545 - classification_loss: 0.0992 199/500 [==========>...................] - ETA: 1:42 - loss: 0.9522 - regression_loss: 0.8532 - classification_loss: 0.0991 200/500 [===========>..................] - ETA: 1:41 - loss: 0.9543 - regression_loss: 0.8548 - classification_loss: 0.0994 201/500 [===========>..................] - ETA: 1:41 - loss: 0.9549 - regression_loss: 0.8553 - classification_loss: 0.0996 202/500 [===========>..................] - ETA: 1:41 - loss: 0.9580 - regression_loss: 0.8578 - classification_loss: 0.1002 203/500 [===========>..................] - ETA: 1:40 - loss: 0.9609 - regression_loss: 0.8598 - classification_loss: 0.1012 204/500 [===========>..................] - ETA: 1:40 - loss: 0.9628 - regression_loss: 0.8613 - classification_loss: 0.1015 205/500 [===========>..................] - ETA: 1:40 - loss: 0.9644 - regression_loss: 0.8625 - classification_loss: 0.1019 206/500 [===========>..................] 
- ETA: 1:39 - loss: 0.9639 - regression_loss: 0.8622 - classification_loss: 0.1017 207/500 [===========>..................] - ETA: 1:39 - loss: 0.9629 - regression_loss: 0.8613 - classification_loss: 0.1016 208/500 [===========>..................] - ETA: 1:39 - loss: 0.9651 - regression_loss: 0.8631 - classification_loss: 0.1019 209/500 [===========>..................] - ETA: 1:38 - loss: 0.9650 - regression_loss: 0.8630 - classification_loss: 0.1020 210/500 [===========>..................] - ETA: 1:38 - loss: 0.9639 - regression_loss: 0.8620 - classification_loss: 0.1019 211/500 [===========>..................] - ETA: 1:38 - loss: 0.9644 - regression_loss: 0.8624 - classification_loss: 0.1020 212/500 [===========>..................] - ETA: 1:37 - loss: 0.9666 - regression_loss: 0.8642 - classification_loss: 0.1024 213/500 [===========>..................] - ETA: 1:37 - loss: 0.9665 - regression_loss: 0.8640 - classification_loss: 0.1025 214/500 [===========>..................] - ETA: 1:37 - loss: 0.9684 - regression_loss: 0.8656 - classification_loss: 0.1027 215/500 [===========>..................] - ETA: 1:36 - loss: 0.9661 - regression_loss: 0.8637 - classification_loss: 0.1024 216/500 [===========>..................] - ETA: 1:36 - loss: 0.9671 - regression_loss: 0.8647 - classification_loss: 0.1024 217/500 [============>.................] - ETA: 1:36 - loss: 0.9641 - regression_loss: 0.8621 - classification_loss: 0.1020 218/500 [============>.................] - ETA: 1:35 - loss: 0.9619 - regression_loss: 0.8601 - classification_loss: 0.1018 219/500 [============>.................] - ETA: 1:35 - loss: 0.9630 - regression_loss: 0.8611 - classification_loss: 0.1018 220/500 [============>.................] - ETA: 1:35 - loss: 0.9621 - regression_loss: 0.8606 - classification_loss: 0.1016 221/500 [============>.................] - ETA: 1:34 - loss: 0.9633 - regression_loss: 0.8617 - classification_loss: 0.1016 222/500 [============>.................] 
- ETA: 1:34 - loss: 0.9626 - regression_loss: 0.8613 - classification_loss: 0.1013 223/500 [============>.................] - ETA: 1:33 - loss: 0.9632 - regression_loss: 0.8619 - classification_loss: 0.1013 224/500 [============>.................] - ETA: 1:33 - loss: 0.9647 - regression_loss: 0.8631 - classification_loss: 0.1015 225/500 [============>.................] - ETA: 1:33 - loss: 0.9630 - regression_loss: 0.8617 - classification_loss: 0.1013 226/500 [============>.................] - ETA: 1:32 - loss: 0.9633 - regression_loss: 0.8620 - classification_loss: 0.1013 227/500 [============>.................] - ETA: 1:32 - loss: 0.9655 - regression_loss: 0.8640 - classification_loss: 0.1015 228/500 [============>.................] - ETA: 1:32 - loss: 0.9681 - regression_loss: 0.8665 - classification_loss: 0.1016 229/500 [============>.................] - ETA: 1:31 - loss: 0.9667 - regression_loss: 0.8654 - classification_loss: 0.1012 230/500 [============>.................] - ETA: 1:31 - loss: 0.9662 - regression_loss: 0.8650 - classification_loss: 0.1012 231/500 [============>.................] - ETA: 1:31 - loss: 0.9679 - regression_loss: 0.8662 - classification_loss: 0.1017 232/500 [============>.................] - ETA: 1:30 - loss: 0.9666 - regression_loss: 0.8651 - classification_loss: 0.1015 233/500 [============>.................] - ETA: 1:30 - loss: 0.9670 - regression_loss: 0.8654 - classification_loss: 0.1016 234/500 [=============>................] - ETA: 1:30 - loss: 0.9649 - regression_loss: 0.8636 - classification_loss: 0.1013 235/500 [=============>................] - ETA: 1:29 - loss: 0.9627 - regression_loss: 0.8617 - classification_loss: 0.1010 236/500 [=============>................] - ETA: 1:29 - loss: 0.9615 - regression_loss: 0.8606 - classification_loss: 0.1009 237/500 [=============>................] - ETA: 1:29 - loss: 0.9625 - regression_loss: 0.8615 - classification_loss: 0.1010 238/500 [=============>................] 
- ETA: 1:28 - loss: 0.9629 - regression_loss: 0.8618 - classification_loss: 0.1011 239/500 [=============>................] - ETA: 1:28 - loss: 0.9621 - regression_loss: 0.8611 - classification_loss: 0.1010 240/500 [=============>................] - ETA: 1:28 - loss: 0.9624 - regression_loss: 0.8613 - classification_loss: 0.1010 241/500 [=============>................] - ETA: 1:27 - loss: 0.9627 - regression_loss: 0.8617 - classification_loss: 0.1010 242/500 [=============>................] - ETA: 1:27 - loss: 0.9639 - regression_loss: 0.8629 - classification_loss: 0.1010 243/500 [=============>................] - ETA: 1:27 - loss: 0.9638 - regression_loss: 0.8628 - classification_loss: 0.1011 244/500 [=============>................] - ETA: 1:26 - loss: 0.9628 - regression_loss: 0.8619 - classification_loss: 0.1009 245/500 [=============>................] - ETA: 1:26 - loss: 0.9644 - regression_loss: 0.8633 - classification_loss: 0.1011 246/500 [=============>................] - ETA: 1:26 - loss: 0.9643 - regression_loss: 0.8632 - classification_loss: 0.1010 247/500 [=============>................] - ETA: 1:25 - loss: 0.9648 - regression_loss: 0.8636 - classification_loss: 0.1012 248/500 [=============>................] - ETA: 1:25 - loss: 0.9648 - regression_loss: 0.8635 - classification_loss: 0.1012 249/500 [=============>................] - ETA: 1:25 - loss: 0.9638 - regression_loss: 0.8628 - classification_loss: 0.1010 250/500 [==============>...............] - ETA: 1:24 - loss: 0.9631 - regression_loss: 0.8620 - classification_loss: 0.1011 251/500 [==============>...............] - ETA: 1:24 - loss: 0.9634 - regression_loss: 0.8625 - classification_loss: 0.1009 252/500 [==============>...............] - ETA: 1:24 - loss: 0.9645 - regression_loss: 0.8636 - classification_loss: 0.1010 253/500 [==============>...............] - ETA: 1:23 - loss: 0.9632 - regression_loss: 0.8625 - classification_loss: 0.1007 254/500 [==============>...............] 
(per-batch Keras progress frames, epoch 34 batches 255-499: loss ~0.95-0.96, regression_loss ~0.85-0.86, classification_loss ~0.10)
500/500 [==============================] - 170s 339ms/step - loss: 0.9562 - regression_loss: 0.8556 - classification_loss: 0.1007
1172 instances of class plum with average precision: 0.7800
mAP: 0.7800
Epoch 00034: saving model to ./training/snapshots/resnet101_pascal_34.h5
Epoch 35/150
(per-batch Keras progress frames, epoch 35 batches 1-8: loss ~0.77-1.13)
9/500 [..............................] 
- ETA: 2:48 - loss: 1.1286 - regression_loss: 1.0017 - classification_loss: 0.1269
(per-batch Keras progress frames, epoch 35 batches 10-88: loss ~0.95-1.10, regression_loss ~0.85-1.01, classification_loss ~0.10-0.13)
89/500 [====>.........................] 
- ETA: 2:19 - loss: 0.9693 - regression_loss: 0.8568 - classification_loss: 0.1125 90/500 [====>.........................] - ETA: 2:19 - loss: 0.9709 - regression_loss: 0.8583 - classification_loss: 0.1126 91/500 [====>.........................] - ETA: 2:18 - loss: 0.9687 - regression_loss: 0.8566 - classification_loss: 0.1120 92/500 [====>.........................] - ETA: 2:18 - loss: 0.9710 - regression_loss: 0.8586 - classification_loss: 0.1124 93/500 [====>.........................] - ETA: 2:18 - loss: 0.9637 - regression_loss: 0.8522 - classification_loss: 0.1115 94/500 [====>.........................] - ETA: 2:17 - loss: 0.9607 - regression_loss: 0.8497 - classification_loss: 0.1110 95/500 [====>.........................] - ETA: 2:17 - loss: 0.9576 - regression_loss: 0.8470 - classification_loss: 0.1106 96/500 [====>.........................] - ETA: 2:17 - loss: 0.9584 - regression_loss: 0.8482 - classification_loss: 0.1102 97/500 [====>.........................] - ETA: 2:17 - loss: 0.9625 - regression_loss: 0.8517 - classification_loss: 0.1108 98/500 [====>.........................] - ETA: 2:16 - loss: 0.9641 - regression_loss: 0.8529 - classification_loss: 0.1112 99/500 [====>.........................] - ETA: 2:16 - loss: 0.9608 - regression_loss: 0.8502 - classification_loss: 0.1106 100/500 [=====>........................] - ETA: 2:16 - loss: 0.9589 - regression_loss: 0.8488 - classification_loss: 0.1100 101/500 [=====>........................] - ETA: 2:15 - loss: 0.9574 - regression_loss: 0.8479 - classification_loss: 0.1094 102/500 [=====>........................] - ETA: 2:15 - loss: 0.9515 - regression_loss: 0.8429 - classification_loss: 0.1086 103/500 [=====>........................] - ETA: 2:15 - loss: 0.9518 - regression_loss: 0.8433 - classification_loss: 0.1085 104/500 [=====>........................] - ETA: 2:14 - loss: 0.9477 - regression_loss: 0.8398 - classification_loss: 0.1079 105/500 [=====>........................] 
- ETA: 2:14 - loss: 0.9483 - regression_loss: 0.8401 - classification_loss: 0.1081 106/500 [=====>........................] - ETA: 2:14 - loss: 0.9511 - regression_loss: 0.8426 - classification_loss: 0.1086 107/500 [=====>........................] - ETA: 2:13 - loss: 0.9550 - regression_loss: 0.8458 - classification_loss: 0.1092 108/500 [=====>........................] - ETA: 2:13 - loss: 0.9523 - regression_loss: 0.8438 - classification_loss: 0.1085 109/500 [=====>........................] - ETA: 2:13 - loss: 0.9522 - regression_loss: 0.8439 - classification_loss: 0.1083 110/500 [=====>........................] - ETA: 2:12 - loss: 0.9550 - regression_loss: 0.8466 - classification_loss: 0.1085 111/500 [=====>........................] - ETA: 2:12 - loss: 0.9587 - regression_loss: 0.8501 - classification_loss: 0.1087 112/500 [=====>........................] - ETA: 2:12 - loss: 0.9602 - regression_loss: 0.8511 - classification_loss: 0.1091 113/500 [=====>........................] - ETA: 2:11 - loss: 0.9662 - regression_loss: 0.8555 - classification_loss: 0.1107 114/500 [=====>........................] - ETA: 2:11 - loss: 0.9637 - regression_loss: 0.8533 - classification_loss: 0.1104 115/500 [=====>........................] - ETA: 2:11 - loss: 0.9640 - regression_loss: 0.8536 - classification_loss: 0.1104 116/500 [=====>........................] - ETA: 2:10 - loss: 0.9603 - regression_loss: 0.8503 - classification_loss: 0.1100 117/500 [======>.......................] - ETA: 2:10 - loss: 0.9621 - regression_loss: 0.8520 - classification_loss: 0.1101 118/500 [======>.......................] - ETA: 2:10 - loss: 0.9612 - regression_loss: 0.8514 - classification_loss: 0.1099 119/500 [======>.......................] - ETA: 2:09 - loss: 0.9623 - regression_loss: 0.8528 - classification_loss: 0.1095 120/500 [======>.......................] - ETA: 2:09 - loss: 0.9625 - regression_loss: 0.8534 - classification_loss: 0.1091 121/500 [======>.......................] 
- ETA: 2:09 - loss: 0.9608 - regression_loss: 0.8522 - classification_loss: 0.1086 122/500 [======>.......................] - ETA: 2:08 - loss: 0.9646 - regression_loss: 0.8557 - classification_loss: 0.1089 123/500 [======>.......................] - ETA: 2:08 - loss: 0.9675 - regression_loss: 0.8581 - classification_loss: 0.1093 124/500 [======>.......................] - ETA: 2:08 - loss: 0.9675 - regression_loss: 0.8581 - classification_loss: 0.1093 125/500 [======>.......................] - ETA: 2:07 - loss: 0.9649 - regression_loss: 0.8561 - classification_loss: 0.1088 126/500 [======>.......................] - ETA: 2:07 - loss: 0.9676 - regression_loss: 0.8584 - classification_loss: 0.1092 127/500 [======>.......................] - ETA: 2:07 - loss: 0.9656 - regression_loss: 0.8567 - classification_loss: 0.1089 128/500 [======>.......................] - ETA: 2:06 - loss: 0.9611 - regression_loss: 0.8529 - classification_loss: 0.1082 129/500 [======>.......................] - ETA: 2:06 - loss: 0.9623 - regression_loss: 0.8543 - classification_loss: 0.1080 130/500 [======>.......................] - ETA: 2:06 - loss: 0.9627 - regression_loss: 0.8552 - classification_loss: 0.1075 131/500 [======>.......................] - ETA: 2:05 - loss: 0.9612 - regression_loss: 0.8541 - classification_loss: 0.1071 132/500 [======>.......................] - ETA: 2:05 - loss: 0.9618 - regression_loss: 0.8550 - classification_loss: 0.1068 133/500 [======>.......................] - ETA: 2:04 - loss: 0.9621 - regression_loss: 0.8553 - classification_loss: 0.1068 134/500 [=======>......................] - ETA: 2:04 - loss: 0.9618 - regression_loss: 0.8550 - classification_loss: 0.1068 135/500 [=======>......................] - ETA: 2:04 - loss: 0.9604 - regression_loss: 0.8539 - classification_loss: 0.1065 136/500 [=======>......................] - ETA: 2:03 - loss: 0.9638 - regression_loss: 0.8568 - classification_loss: 0.1070 137/500 [=======>......................] 
- ETA: 2:03 - loss: 0.9653 - regression_loss: 0.8581 - classification_loss: 0.1072 138/500 [=======>......................] - ETA: 2:03 - loss: 0.9605 - regression_loss: 0.8539 - classification_loss: 0.1066 139/500 [=======>......................] - ETA: 2:02 - loss: 0.9597 - regression_loss: 0.8534 - classification_loss: 0.1063 140/500 [=======>......................] - ETA: 2:02 - loss: 0.9567 - regression_loss: 0.8508 - classification_loss: 0.1058 141/500 [=======>......................] - ETA: 2:02 - loss: 0.9592 - regression_loss: 0.8529 - classification_loss: 0.1063 142/500 [=======>......................] - ETA: 2:01 - loss: 0.9596 - regression_loss: 0.8532 - classification_loss: 0.1064 143/500 [=======>......................] - ETA: 2:01 - loss: 0.9581 - regression_loss: 0.8521 - classification_loss: 0.1061 144/500 [=======>......................] - ETA: 2:01 - loss: 0.9533 - regression_loss: 0.8476 - classification_loss: 0.1056 145/500 [=======>......................] - ETA: 2:00 - loss: 0.9527 - regression_loss: 0.8473 - classification_loss: 0.1054 146/500 [=======>......................] - ETA: 2:00 - loss: 0.9517 - regression_loss: 0.8466 - classification_loss: 0.1051 147/500 [=======>......................] - ETA: 2:00 - loss: 0.9533 - regression_loss: 0.8479 - classification_loss: 0.1054 148/500 [=======>......................] - ETA: 2:00 - loss: 0.9588 - regression_loss: 0.8526 - classification_loss: 0.1062 149/500 [=======>......................] - ETA: 1:59 - loss: 0.9589 - regression_loss: 0.8528 - classification_loss: 0.1060 150/500 [========>.....................] - ETA: 1:59 - loss: 0.9614 - regression_loss: 0.8554 - classification_loss: 0.1060 151/500 [========>.....................] - ETA: 1:58 - loss: 0.9618 - regression_loss: 0.8558 - classification_loss: 0.1059 152/500 [========>.....................] - ETA: 1:58 - loss: 0.9599 - regression_loss: 0.8544 - classification_loss: 0.1054 153/500 [========>.....................] 
- ETA: 1:58 - loss: 0.9609 - regression_loss: 0.8555 - classification_loss: 0.1053 154/500 [========>.....................] - ETA: 1:57 - loss: 0.9611 - regression_loss: 0.8561 - classification_loss: 0.1050 155/500 [========>.....................] - ETA: 1:57 - loss: 0.9599 - regression_loss: 0.8552 - classification_loss: 0.1047 156/500 [========>.....................] - ETA: 1:57 - loss: 0.9578 - regression_loss: 0.8532 - classification_loss: 0.1047 157/500 [========>.....................] - ETA: 1:56 - loss: 0.9552 - regression_loss: 0.8509 - classification_loss: 0.1043 158/500 [========>.....................] - ETA: 1:56 - loss: 0.9540 - regression_loss: 0.8496 - classification_loss: 0.1044 159/500 [========>.....................] - ETA: 1:56 - loss: 0.9543 - regression_loss: 0.8501 - classification_loss: 0.1042 160/500 [========>.....................] - ETA: 1:55 - loss: 0.9537 - regression_loss: 0.8497 - classification_loss: 0.1040 161/500 [========>.....................] - ETA: 1:55 - loss: 0.9619 - regression_loss: 0.8571 - classification_loss: 0.1048 162/500 [========>.....................] - ETA: 1:55 - loss: 0.9645 - regression_loss: 0.8591 - classification_loss: 0.1054 163/500 [========>.....................] - ETA: 1:54 - loss: 0.9694 - regression_loss: 0.8632 - classification_loss: 0.1062 164/500 [========>.....................] - ETA: 1:54 - loss: 0.9678 - regression_loss: 0.8619 - classification_loss: 0.1059 165/500 [========>.....................] - ETA: 1:54 - loss: 0.9669 - regression_loss: 0.8611 - classification_loss: 0.1058 166/500 [========>.....................] - ETA: 1:53 - loss: 0.9694 - regression_loss: 0.8633 - classification_loss: 0.1060 167/500 [=========>....................] - ETA: 1:53 - loss: 0.9693 - regression_loss: 0.8632 - classification_loss: 0.1061 168/500 [=========>....................] - ETA: 1:53 - loss: 0.9674 - regression_loss: 0.8616 - classification_loss: 0.1058 169/500 [=========>....................] 
- ETA: 1:52 - loss: 0.9654 - regression_loss: 0.8600 - classification_loss: 0.1054 170/500 [=========>....................] - ETA: 1:52 - loss: 0.9626 - regression_loss: 0.8576 - classification_loss: 0.1050 171/500 [=========>....................] - ETA: 1:52 - loss: 0.9630 - regression_loss: 0.8579 - classification_loss: 0.1051 172/500 [=========>....................] - ETA: 1:51 - loss: 0.9660 - regression_loss: 0.8602 - classification_loss: 0.1058 173/500 [=========>....................] - ETA: 1:51 - loss: 0.9671 - regression_loss: 0.8615 - classification_loss: 0.1057 174/500 [=========>....................] - ETA: 1:51 - loss: 0.9674 - regression_loss: 0.8618 - classification_loss: 0.1057 175/500 [=========>....................] - ETA: 1:50 - loss: 0.9657 - regression_loss: 0.8604 - classification_loss: 0.1053 176/500 [=========>....................] - ETA: 1:50 - loss: 0.9649 - regression_loss: 0.8594 - classification_loss: 0.1055 177/500 [=========>....................] - ETA: 1:49 - loss: 0.9642 - regression_loss: 0.8588 - classification_loss: 0.1054 178/500 [=========>....................] - ETA: 1:49 - loss: 0.9625 - regression_loss: 0.8573 - classification_loss: 0.1052 179/500 [=========>....................] - ETA: 1:49 - loss: 0.9629 - regression_loss: 0.8577 - classification_loss: 0.1052 180/500 [=========>....................] - ETA: 1:48 - loss: 0.9605 - regression_loss: 0.8558 - classification_loss: 0.1047 181/500 [=========>....................] - ETA: 1:48 - loss: 0.9658 - regression_loss: 0.8604 - classification_loss: 0.1054 182/500 [=========>....................] - ETA: 1:48 - loss: 0.9647 - regression_loss: 0.8595 - classification_loss: 0.1052 183/500 [=========>....................] - ETA: 1:47 - loss: 0.9652 - regression_loss: 0.8597 - classification_loss: 0.1055 184/500 [==========>...................] - ETA: 1:47 - loss: 0.9636 - regression_loss: 0.8583 - classification_loss: 0.1053 185/500 [==========>...................] 
- ETA: 1:47 - loss: 0.9644 - regression_loss: 0.8579 - classification_loss: 0.1066 186/500 [==========>...................] - ETA: 1:46 - loss: 0.9660 - regression_loss: 0.8592 - classification_loss: 0.1069 187/500 [==========>...................] - ETA: 1:46 - loss: 0.9665 - regression_loss: 0.8596 - classification_loss: 0.1069 188/500 [==========>...................] - ETA: 1:46 - loss: 0.9676 - regression_loss: 0.8605 - classification_loss: 0.1071 189/500 [==========>...................] - ETA: 1:45 - loss: 0.9694 - regression_loss: 0.8621 - classification_loss: 0.1073 190/500 [==========>...................] - ETA: 1:45 - loss: 0.9720 - regression_loss: 0.8647 - classification_loss: 0.1073 191/500 [==========>...................] - ETA: 1:45 - loss: 0.9709 - regression_loss: 0.8638 - classification_loss: 0.1071 192/500 [==========>...................] - ETA: 1:44 - loss: 0.9718 - regression_loss: 0.8646 - classification_loss: 0.1072 193/500 [==========>...................] - ETA: 1:44 - loss: 0.9740 - regression_loss: 0.8664 - classification_loss: 0.1075 194/500 [==========>...................] - ETA: 1:44 - loss: 0.9739 - regression_loss: 0.8663 - classification_loss: 0.1076 195/500 [==========>...................] - ETA: 1:43 - loss: 0.9764 - regression_loss: 0.8682 - classification_loss: 0.1081 196/500 [==========>...................] - ETA: 1:43 - loss: 0.9758 - regression_loss: 0.8677 - classification_loss: 0.1081 197/500 [==========>...................] - ETA: 1:43 - loss: 0.9760 - regression_loss: 0.8679 - classification_loss: 0.1081 198/500 [==========>...................] - ETA: 1:42 - loss: 0.9734 - regression_loss: 0.8655 - classification_loss: 0.1079 199/500 [==========>...................] - ETA: 1:42 - loss: 0.9721 - regression_loss: 0.8645 - classification_loss: 0.1076 200/500 [===========>..................] - ETA: 1:42 - loss: 0.9692 - regression_loss: 0.8620 - classification_loss: 0.1072 201/500 [===========>..................] 
- ETA: 1:41 - loss: 0.9714 - regression_loss: 0.8639 - classification_loss: 0.1074 202/500 [===========>..................] - ETA: 1:41 - loss: 0.9702 - regression_loss: 0.8631 - classification_loss: 0.1072 203/500 [===========>..................] - ETA: 1:41 - loss: 0.9715 - regression_loss: 0.8643 - classification_loss: 0.1072 204/500 [===========>..................] - ETA: 1:40 - loss: 0.9718 - regression_loss: 0.8645 - classification_loss: 0.1073 205/500 [===========>..................] - ETA: 1:40 - loss: 0.9712 - regression_loss: 0.8638 - classification_loss: 0.1074 206/500 [===========>..................] - ETA: 1:40 - loss: 0.9680 - regression_loss: 0.8611 - classification_loss: 0.1070 207/500 [===========>..................] - ETA: 1:39 - loss: 0.9656 - regression_loss: 0.8590 - classification_loss: 0.1066 208/500 [===========>..................] - ETA: 1:39 - loss: 0.9686 - regression_loss: 0.8617 - classification_loss: 0.1069 209/500 [===========>..................] - ETA: 1:39 - loss: 0.9696 - regression_loss: 0.8626 - classification_loss: 0.1070 210/500 [===========>..................] - ETA: 1:38 - loss: 0.9670 - regression_loss: 0.8604 - classification_loss: 0.1066 211/500 [===========>..................] - ETA: 1:38 - loss: 0.9676 - regression_loss: 0.8610 - classification_loss: 0.1066 212/500 [===========>..................] - ETA: 1:38 - loss: 0.9645 - regression_loss: 0.8582 - classification_loss: 0.1063 213/500 [===========>..................] - ETA: 1:37 - loss: 0.9623 - regression_loss: 0.8563 - classification_loss: 0.1060 214/500 [===========>..................] - ETA: 1:37 - loss: 0.9607 - regression_loss: 0.8550 - classification_loss: 0.1057 215/500 [===========>..................] - ETA: 1:37 - loss: 0.9607 - regression_loss: 0.8554 - classification_loss: 0.1054 216/500 [===========>..................] - ETA: 1:36 - loss: 0.9622 - regression_loss: 0.8566 - classification_loss: 0.1056 217/500 [============>.................] 
- ETA: 1:36 - loss: 0.9597 - regression_loss: 0.8545 - classification_loss: 0.1052 218/500 [============>.................] - ETA: 1:36 - loss: 0.9572 - regression_loss: 0.8521 - classification_loss: 0.1050 219/500 [============>.................] - ETA: 1:35 - loss: 0.9572 - regression_loss: 0.8522 - classification_loss: 0.1050 220/500 [============>.................] - ETA: 1:35 - loss: 0.9576 - regression_loss: 0.8526 - classification_loss: 0.1051 221/500 [============>.................] - ETA: 1:35 - loss: 0.9563 - regression_loss: 0.8514 - classification_loss: 0.1048 222/500 [============>.................] - ETA: 1:34 - loss: 0.9559 - regression_loss: 0.8512 - classification_loss: 0.1047 223/500 [============>.................] - ETA: 1:34 - loss: 0.9561 - regression_loss: 0.8515 - classification_loss: 0.1046 224/500 [============>.................] - ETA: 1:33 - loss: 0.9577 - regression_loss: 0.8529 - classification_loss: 0.1048 225/500 [============>.................] - ETA: 1:33 - loss: 0.9578 - regression_loss: 0.8531 - classification_loss: 0.1047 226/500 [============>.................] - ETA: 1:33 - loss: 0.9598 - regression_loss: 0.8547 - classification_loss: 0.1051 227/500 [============>.................] - ETA: 1:32 - loss: 0.9583 - regression_loss: 0.8535 - classification_loss: 0.1049 228/500 [============>.................] - ETA: 1:32 - loss: 0.9591 - regression_loss: 0.8538 - classification_loss: 0.1054 229/500 [============>.................] - ETA: 1:32 - loss: 0.9559 - regression_loss: 0.8510 - classification_loss: 0.1050 230/500 [============>.................] - ETA: 1:31 - loss: 0.9573 - regression_loss: 0.8521 - classification_loss: 0.1052 231/500 [============>.................] - ETA: 1:31 - loss: 0.9554 - regression_loss: 0.8504 - classification_loss: 0.1050 232/500 [============>.................] - ETA: 1:31 - loss: 0.9547 - regression_loss: 0.8499 - classification_loss: 0.1048 233/500 [============>.................] 
- ETA: 1:30 - loss: 0.9543 - regression_loss: 0.8496 - classification_loss: 0.1047 234/500 [=============>................] - ETA: 1:30 - loss: 0.9518 - regression_loss: 0.8474 - classification_loss: 0.1044 235/500 [=============>................] - ETA: 1:30 - loss: 0.9511 - regression_loss: 0.8469 - classification_loss: 0.1042 236/500 [=============>................] - ETA: 1:29 - loss: 0.9501 - regression_loss: 0.8461 - classification_loss: 0.1040 237/500 [=============>................] - ETA: 1:29 - loss: 0.9501 - regression_loss: 0.8461 - classification_loss: 0.1040 238/500 [=============>................] - ETA: 1:29 - loss: 0.9493 - regression_loss: 0.8454 - classification_loss: 0.1039 239/500 [=============>................] - ETA: 1:28 - loss: 0.9491 - regression_loss: 0.8453 - classification_loss: 0.1037 240/500 [=============>................] - ETA: 1:28 - loss: 0.9509 - regression_loss: 0.8469 - classification_loss: 0.1040 241/500 [=============>................] - ETA: 1:28 - loss: 0.9511 - regression_loss: 0.8472 - classification_loss: 0.1039 242/500 [=============>................] - ETA: 1:27 - loss: 0.9507 - regression_loss: 0.8468 - classification_loss: 0.1038 243/500 [=============>................] - ETA: 1:27 - loss: 0.9494 - regression_loss: 0.8457 - classification_loss: 0.1037 244/500 [=============>................] - ETA: 1:27 - loss: 0.9494 - regression_loss: 0.8458 - classification_loss: 0.1036 245/500 [=============>................] - ETA: 1:26 - loss: 0.9507 - regression_loss: 0.8468 - classification_loss: 0.1039 246/500 [=============>................] - ETA: 1:26 - loss: 0.9501 - regression_loss: 0.8463 - classification_loss: 0.1039 247/500 [=============>................] - ETA: 1:26 - loss: 0.9504 - regression_loss: 0.8463 - classification_loss: 0.1041 248/500 [=============>................] - ETA: 1:25 - loss: 0.9483 - regression_loss: 0.8444 - classification_loss: 0.1039 249/500 [=============>................] 
- ETA: 1:25 - loss: 0.9491 - regression_loss: 0.8451 - classification_loss: 0.1040 250/500 [==============>...............] - ETA: 1:24 - loss: 0.9481 - regression_loss: 0.8441 - classification_loss: 0.1040 251/500 [==============>...............] - ETA: 1:24 - loss: 0.9478 - regression_loss: 0.8438 - classification_loss: 0.1039 252/500 [==============>...............] - ETA: 1:24 - loss: 0.9473 - regression_loss: 0.8434 - classification_loss: 0.1039 253/500 [==============>...............] - ETA: 1:23 - loss: 0.9473 - regression_loss: 0.8434 - classification_loss: 0.1040 254/500 [==============>...............] - ETA: 1:23 - loss: 0.9474 - regression_loss: 0.8434 - classification_loss: 0.1040 255/500 [==============>...............] - ETA: 1:23 - loss: 0.9486 - regression_loss: 0.8444 - classification_loss: 0.1042 256/500 [==============>...............] - ETA: 1:22 - loss: 0.9496 - regression_loss: 0.8453 - classification_loss: 0.1043 257/500 [==============>...............] - ETA: 1:22 - loss: 0.9508 - regression_loss: 0.8465 - classification_loss: 0.1043 258/500 [==============>...............] - ETA: 1:22 - loss: 0.9513 - regression_loss: 0.8472 - classification_loss: 0.1041 259/500 [==============>...............] - ETA: 1:21 - loss: 0.9500 - regression_loss: 0.8461 - classification_loss: 0.1039 260/500 [==============>...............] - ETA: 1:21 - loss: 0.9497 - regression_loss: 0.8460 - classification_loss: 0.1037 261/500 [==============>...............] - ETA: 1:21 - loss: 0.9517 - regression_loss: 0.8478 - classification_loss: 0.1039 262/500 [==============>...............] - ETA: 1:20 - loss: 0.9526 - regression_loss: 0.8487 - classification_loss: 0.1039 263/500 [==============>...............] - ETA: 1:20 - loss: 0.9531 - regression_loss: 0.8491 - classification_loss: 0.1040 264/500 [==============>...............] - ETA: 1:20 - loss: 0.9533 - regression_loss: 0.8494 - classification_loss: 0.1040 265/500 [==============>...............] 
- ETA: 1:19 - loss: 0.9541 - regression_loss: 0.8499 - classification_loss: 0.1042 266/500 [==============>...............] - ETA: 1:19 - loss: 0.9522 - regression_loss: 0.8483 - classification_loss: 0.1039 267/500 [===============>..............] - ETA: 1:19 - loss: 0.9520 - regression_loss: 0.8482 - classification_loss: 0.1038 268/500 [===============>..............] - ETA: 1:18 - loss: 0.9539 - regression_loss: 0.8498 - classification_loss: 0.1041 269/500 [===============>..............] - ETA: 1:18 - loss: 0.9540 - regression_loss: 0.8500 - classification_loss: 0.1041 270/500 [===============>..............] - ETA: 1:18 - loss: 0.9551 - regression_loss: 0.8508 - classification_loss: 0.1043 271/500 [===============>..............] - ETA: 1:17 - loss: 0.9579 - regression_loss: 0.8530 - classification_loss: 0.1049 272/500 [===============>..............] - ETA: 1:17 - loss: 0.9578 - regression_loss: 0.8529 - classification_loss: 0.1049 273/500 [===============>..............] - ETA: 1:17 - loss: 0.9579 - regression_loss: 0.8531 - classification_loss: 0.1048 274/500 [===============>..............] - ETA: 1:16 - loss: 0.9577 - regression_loss: 0.8529 - classification_loss: 0.1048 275/500 [===============>..............] - ETA: 1:16 - loss: 0.9573 - regression_loss: 0.8526 - classification_loss: 0.1047 276/500 [===============>..............] - ETA: 1:16 - loss: 0.9571 - regression_loss: 0.8524 - classification_loss: 0.1047 277/500 [===============>..............] - ETA: 1:15 - loss: 0.9569 - regression_loss: 0.8523 - classification_loss: 0.1046 278/500 [===============>..............] - ETA: 1:15 - loss: 0.9580 - regression_loss: 0.8534 - classification_loss: 0.1046 279/500 [===============>..............] - ETA: 1:15 - loss: 0.9568 - regression_loss: 0.8524 - classification_loss: 0.1044 280/500 [===============>..............] - ETA: 1:14 - loss: 0.9561 - regression_loss: 0.8519 - classification_loss: 0.1042 281/500 [===============>..............] 
- ETA: 1:14 - loss: 0.9577 - regression_loss: 0.8532 - classification_loss: 0.1045 282/500 [===============>..............] - ETA: 1:14 - loss: 0.9581 - regression_loss: 0.8536 - classification_loss: 0.1044 283/500 [===============>..............] - ETA: 1:13 - loss: 0.9588 - regression_loss: 0.8543 - classification_loss: 0.1045 284/500 [================>.............] - ETA: 1:13 - loss: 0.9590 - regression_loss: 0.8545 - classification_loss: 0.1045 285/500 [================>.............] - ETA: 1:13 - loss: 0.9564 - regression_loss: 0.8522 - classification_loss: 0.1042 286/500 [================>.............] - ETA: 1:12 - loss: 0.9565 - regression_loss: 0.8524 - classification_loss: 0.1041 287/500 [================>.............] - ETA: 1:12 - loss: 0.9571 - regression_loss: 0.8529 - classification_loss: 0.1042 288/500 [================>.............] - ETA: 1:12 - loss: 0.9559 - regression_loss: 0.8519 - classification_loss: 0.1039 289/500 [================>.............] - ETA: 1:11 - loss: 0.9549 - regression_loss: 0.8511 - classification_loss: 0.1039 290/500 [================>.............] - ETA: 1:11 - loss: 0.9553 - regression_loss: 0.8515 - classification_loss: 0.1038 291/500 [================>.............] - ETA: 1:11 - loss: 0.9570 - regression_loss: 0.8529 - classification_loss: 0.1040 292/500 [================>.............] - ETA: 1:10 - loss: 0.9558 - regression_loss: 0.8520 - classification_loss: 0.1038 293/500 [================>.............] - ETA: 1:10 - loss: 0.9549 - regression_loss: 0.8512 - classification_loss: 0.1037 294/500 [================>.............] - ETA: 1:10 - loss: 0.9546 - regression_loss: 0.8510 - classification_loss: 0.1037 295/500 [================>.............] - ETA: 1:09 - loss: 0.9557 - regression_loss: 0.8518 - classification_loss: 0.1039 296/500 [================>.............] - ETA: 1:09 - loss: 0.9550 - regression_loss: 0.8514 - classification_loss: 0.1037 297/500 [================>.............] 
- ETA: 1:09 - loss: 0.9553 - regression_loss: 0.8515 - classification_loss: 0.1037 [per-batch progress updates for steps 298-488 of epoch 35 elided: loss stayed in the ~0.938-0.961 range, regression_loss ~0.837-0.857, classification_loss ~0.101-0.105] 489/500 [============================>.]
- ETA: 3s - loss: 0.9386 - regression_loss: 0.8376 - classification_loss: 0.1009 [per-batch updates for steps 490-499 elided]
500/500 [==============================] - 170s 339ms/step - loss: 0.9385 - regression_loss: 0.8376 - classification_loss: 0.1009
1172 instances of class plum with average precision: 0.8180
mAP: 0.8180
Epoch 00035: saving model to ./training/snapshots/resnet101_pascal_35.h5
Epoch 36/150
1/500 [..............................] - ETA: 2:36 - loss: 1.0801 - regression_loss: 0.9664 - classification_loss: 0.1138 [steps 2-3 elided] 4/500 [..............................]
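Each update in this log follows Keras's fixed progress-bar format, so per-epoch results can be recovered from the captured text with a short script. The sketch below is an assumption-laden helper (the name `parse_keras_log` is hypothetical, not part of this run): it pulls out only the epoch-final `N/N` summary lines and the `mAP:` evaluation lines, ignoring the intermediate per-batch updates.

```python
import re

# Epoch-final summary lines look like:
#   500/500 [==============================] - 170s 339ms/step - loss: 0.9385 - regression_loss: 0.8376 - classification_loss: 0.1009
# The (\d+)/\1 backreference matches only completed epochs (e.g. 500/500, never 132/500).
EPOCH_END = re.compile(
    r"(\d+)/\1 \[=+\] - \d+s .*?"
    r"loss: ([\d.]+) - regression_loss: ([\d.]+) - classification_loss: ([\d.]+)"
)
# Evaluation lines look like:  mAP: 0.8180
MAP_LINE = re.compile(r"mAP: ([\d.]+)")

def parse_keras_log(text):
    """Extract epoch-final losses and mAP values from a captured Keras progress log."""
    losses = [
        {"loss": float(l), "regression_loss": float(r), "classification_loss": float(c)}
        for _, l, r, c in EPOCH_END.findall(text)
    ]
    maps = [float(m) for m in MAP_LINE.findall(text)]
    return {"epoch_losses": losses, "mAP": maps}
```

When training output is redirected to a file like this, passing `verbose=2` to the Keras fit call makes it emit one summary line per epoch instead of these overwritten per-batch updates.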
- ETA: 2:44 - loss: 1.1007 - regression_loss: 0.9804 - classification_loss: 0.1203 [per-batch progress updates for steps 5-131 of epoch 36 elided: loss fluctuated between ~0.79 and ~1.10 early in the epoch before settling near 0.96] 132/500 [======>.......................]
- ETA: 2:04 - loss: 0.9655 - regression_loss: 0.8571 - classification_loss: 0.1083 133/500 [======>.......................] - ETA: 2:04 - loss: 0.9643 - regression_loss: 0.8561 - classification_loss: 0.1082 134/500 [=======>......................] - ETA: 2:03 - loss: 0.9655 - regression_loss: 0.8571 - classification_loss: 0.1084 135/500 [=======>......................] - ETA: 2:03 - loss: 0.9628 - regression_loss: 0.8549 - classification_loss: 0.1078 136/500 [=======>......................] - ETA: 2:03 - loss: 0.9594 - regression_loss: 0.8521 - classification_loss: 0.1073 137/500 [=======>......................] - ETA: 2:02 - loss: 0.9581 - regression_loss: 0.8513 - classification_loss: 0.1068 138/500 [=======>......................] - ETA: 2:02 - loss: 0.9594 - regression_loss: 0.8528 - classification_loss: 0.1066 139/500 [=======>......................] - ETA: 2:02 - loss: 0.9599 - regression_loss: 0.8533 - classification_loss: 0.1066 140/500 [=======>......................] - ETA: 2:01 - loss: 0.9595 - regression_loss: 0.8532 - classification_loss: 0.1063 141/500 [=======>......................] - ETA: 2:01 - loss: 0.9607 - regression_loss: 0.8543 - classification_loss: 0.1065 142/500 [=======>......................] - ETA: 2:01 - loss: 0.9601 - regression_loss: 0.8535 - classification_loss: 0.1066 143/500 [=======>......................] - ETA: 2:00 - loss: 0.9603 - regression_loss: 0.8537 - classification_loss: 0.1066 144/500 [=======>......................] - ETA: 2:00 - loss: 0.9583 - regression_loss: 0.8514 - classification_loss: 0.1069 145/500 [=======>......................] - ETA: 2:00 - loss: 0.9544 - regression_loss: 0.8480 - classification_loss: 0.1064 146/500 [=======>......................] - ETA: 1:59 - loss: 0.9557 - regression_loss: 0.8493 - classification_loss: 0.1064 147/500 [=======>......................] - ETA: 1:59 - loss: 0.9514 - regression_loss: 0.8456 - classification_loss: 0.1059 148/500 [=======>......................] 
- ETA: 1:59 - loss: 0.9507 - regression_loss: 0.8446 - classification_loss: 0.1061 149/500 [=======>......................] - ETA: 1:58 - loss: 0.9482 - regression_loss: 0.8427 - classification_loss: 0.1055 150/500 [========>.....................] - ETA: 1:58 - loss: 0.9468 - regression_loss: 0.8417 - classification_loss: 0.1052 151/500 [========>.....................] - ETA: 1:58 - loss: 0.9477 - regression_loss: 0.8425 - classification_loss: 0.1052 152/500 [========>.....................] - ETA: 1:57 - loss: 0.9470 - regression_loss: 0.8421 - classification_loss: 0.1049 153/500 [========>.....................] - ETA: 1:57 - loss: 0.9535 - regression_loss: 0.8476 - classification_loss: 0.1060 154/500 [========>.....................] - ETA: 1:57 - loss: 0.9526 - regression_loss: 0.8471 - classification_loss: 0.1055 155/500 [========>.....................] - ETA: 1:56 - loss: 0.9529 - regression_loss: 0.8475 - classification_loss: 0.1054 156/500 [========>.....................] - ETA: 1:56 - loss: 0.9534 - regression_loss: 0.8481 - classification_loss: 0.1053 157/500 [========>.....................] - ETA: 1:56 - loss: 0.9557 - regression_loss: 0.8501 - classification_loss: 0.1056 158/500 [========>.....................] - ETA: 1:55 - loss: 0.9517 - regression_loss: 0.8467 - classification_loss: 0.1051 159/500 [========>.....................] - ETA: 1:55 - loss: 0.9495 - regression_loss: 0.8447 - classification_loss: 0.1047 160/500 [========>.....................] - ETA: 1:55 - loss: 0.9515 - regression_loss: 0.8465 - classification_loss: 0.1050 161/500 [========>.....................] - ETA: 1:54 - loss: 0.9511 - regression_loss: 0.8463 - classification_loss: 0.1048 162/500 [========>.....................] - ETA: 1:54 - loss: 0.9487 - regression_loss: 0.8444 - classification_loss: 0.1044 163/500 [========>.....................] - ETA: 1:54 - loss: 0.9485 - regression_loss: 0.8442 - classification_loss: 0.1043 164/500 [========>.....................] 
- ETA: 1:53 - loss: 0.9483 - regression_loss: 0.8441 - classification_loss: 0.1041 165/500 [========>.....................] - ETA: 1:53 - loss: 0.9462 - regression_loss: 0.8424 - classification_loss: 0.1038 166/500 [========>.....................] - ETA: 1:53 - loss: 0.9456 - regression_loss: 0.8419 - classification_loss: 0.1036 167/500 [=========>....................] - ETA: 1:52 - loss: 0.9435 - regression_loss: 0.8401 - classification_loss: 0.1034 168/500 [=========>....................] - ETA: 1:52 - loss: 0.9415 - regression_loss: 0.8384 - classification_loss: 0.1031 169/500 [=========>....................] - ETA: 1:52 - loss: 0.9410 - regression_loss: 0.8379 - classification_loss: 0.1031 170/500 [=========>....................] - ETA: 1:51 - loss: 0.9388 - regression_loss: 0.8360 - classification_loss: 0.1027 171/500 [=========>....................] - ETA: 1:51 - loss: 0.9398 - regression_loss: 0.8371 - classification_loss: 0.1027 172/500 [=========>....................] - ETA: 1:51 - loss: 0.9403 - regression_loss: 0.8375 - classification_loss: 0.1028 173/500 [=========>....................] - ETA: 1:50 - loss: 0.9405 - regression_loss: 0.8378 - classification_loss: 0.1027 174/500 [=========>....................] - ETA: 1:50 - loss: 0.9383 - regression_loss: 0.8358 - classification_loss: 0.1025 175/500 [=========>....................] - ETA: 1:50 - loss: 0.9370 - regression_loss: 0.8349 - classification_loss: 0.1022 176/500 [=========>....................] - ETA: 1:49 - loss: 0.9344 - regression_loss: 0.8327 - classification_loss: 0.1018 177/500 [=========>....................] - ETA: 1:49 - loss: 0.9359 - regression_loss: 0.8337 - classification_loss: 0.1022 178/500 [=========>....................] - ETA: 1:49 - loss: 0.9384 - regression_loss: 0.8357 - classification_loss: 0.1027 179/500 [=========>....................] - ETA: 1:48 - loss: 0.9394 - regression_loss: 0.8366 - classification_loss: 0.1028 180/500 [=========>....................] 
- ETA: 1:48 - loss: 0.9413 - regression_loss: 0.8381 - classification_loss: 0.1032 181/500 [=========>....................] - ETA: 1:48 - loss: 0.9402 - regression_loss: 0.8371 - classification_loss: 0.1031 182/500 [=========>....................] - ETA: 1:47 - loss: 0.9396 - regression_loss: 0.8368 - classification_loss: 0.1028 183/500 [=========>....................] - ETA: 1:47 - loss: 0.9362 - regression_loss: 0.8339 - classification_loss: 0.1023 184/500 [==========>...................] - ETA: 1:47 - loss: 0.9385 - regression_loss: 0.8354 - classification_loss: 0.1030 185/500 [==========>...................] - ETA: 1:46 - loss: 0.9415 - regression_loss: 0.8378 - classification_loss: 0.1037 186/500 [==========>...................] - ETA: 1:46 - loss: 0.9385 - regression_loss: 0.8351 - classification_loss: 0.1033 187/500 [==========>...................] - ETA: 1:46 - loss: 0.9370 - regression_loss: 0.8338 - classification_loss: 0.1032 188/500 [==========>...................] - ETA: 1:45 - loss: 0.9358 - regression_loss: 0.8330 - classification_loss: 0.1028 189/500 [==========>...................] - ETA: 1:45 - loss: 0.9352 - regression_loss: 0.8324 - classification_loss: 0.1028 190/500 [==========>...................] - ETA: 1:45 - loss: 0.9369 - regression_loss: 0.8338 - classification_loss: 0.1031 191/500 [==========>...................] - ETA: 1:44 - loss: 0.9363 - regression_loss: 0.8336 - classification_loss: 0.1027 192/500 [==========>...................] - ETA: 1:44 - loss: 0.9341 - regression_loss: 0.8318 - classification_loss: 0.1023 193/500 [==========>...................] - ETA: 1:44 - loss: 0.9319 - regression_loss: 0.8301 - classification_loss: 0.1019 194/500 [==========>...................] - ETA: 1:43 - loss: 0.9326 - regression_loss: 0.8308 - classification_loss: 0.1018 195/500 [==========>...................] - ETA: 1:43 - loss: 0.9335 - regression_loss: 0.8317 - classification_loss: 0.1018 196/500 [==========>...................] 
- ETA: 1:43 - loss: 0.9344 - regression_loss: 0.8325 - classification_loss: 0.1019 197/500 [==========>...................] - ETA: 1:42 - loss: 0.9320 - regression_loss: 0.8305 - classification_loss: 0.1015 198/500 [==========>...................] - ETA: 1:42 - loss: 0.9310 - regression_loss: 0.8298 - classification_loss: 0.1013 199/500 [==========>...................] - ETA: 1:42 - loss: 0.9293 - regression_loss: 0.8281 - classification_loss: 0.1011 200/500 [===========>..................] - ETA: 1:41 - loss: 0.9279 - regression_loss: 0.8269 - classification_loss: 0.1010 201/500 [===========>..................] - ETA: 1:41 - loss: 0.9258 - regression_loss: 0.8252 - classification_loss: 0.1006 202/500 [===========>..................] - ETA: 1:40 - loss: 0.9280 - regression_loss: 0.8270 - classification_loss: 0.1010 203/500 [===========>..................] - ETA: 1:40 - loss: 0.9285 - regression_loss: 0.8276 - classification_loss: 0.1009 204/500 [===========>..................] - ETA: 1:40 - loss: 0.9301 - regression_loss: 0.8288 - classification_loss: 0.1013 205/500 [===========>..................] - ETA: 1:39 - loss: 0.9304 - regression_loss: 0.8291 - classification_loss: 0.1013 206/500 [===========>..................] - ETA: 1:39 - loss: 0.9287 - regression_loss: 0.8276 - classification_loss: 0.1011 207/500 [===========>..................] - ETA: 1:39 - loss: 0.9289 - regression_loss: 0.8278 - classification_loss: 0.1011 208/500 [===========>..................] - ETA: 1:38 - loss: 0.9282 - regression_loss: 0.8272 - classification_loss: 0.1010 209/500 [===========>..................] - ETA: 1:38 - loss: 0.9284 - regression_loss: 0.8274 - classification_loss: 0.1010 210/500 [===========>..................] - ETA: 1:38 - loss: 0.9264 - regression_loss: 0.8256 - classification_loss: 0.1008 211/500 [===========>..................] - ETA: 1:37 - loss: 0.9263 - regression_loss: 0.8256 - classification_loss: 0.1007 212/500 [===========>..................] 
- ETA: 1:37 - loss: 0.9283 - regression_loss: 0.8272 - classification_loss: 0.1011 213/500 [===========>..................] - ETA: 1:37 - loss: 0.9309 - regression_loss: 0.8295 - classification_loss: 0.1015 214/500 [===========>..................] - ETA: 1:36 - loss: 0.9281 - regression_loss: 0.8270 - classification_loss: 0.1011 215/500 [===========>..................] - ETA: 1:36 - loss: 0.9255 - regression_loss: 0.8247 - classification_loss: 0.1008 216/500 [===========>..................] - ETA: 1:36 - loss: 0.9267 - regression_loss: 0.8261 - classification_loss: 0.1007 217/500 [============>.................] - ETA: 1:35 - loss: 0.9259 - regression_loss: 0.8254 - classification_loss: 0.1005 218/500 [============>.................] - ETA: 1:35 - loss: 0.9232 - regression_loss: 0.8231 - classification_loss: 0.1001 219/500 [============>.................] - ETA: 1:35 - loss: 0.9254 - regression_loss: 0.8252 - classification_loss: 0.1002 220/500 [============>.................] - ETA: 1:34 - loss: 0.9255 - regression_loss: 0.8254 - classification_loss: 0.1001 221/500 [============>.................] - ETA: 1:34 - loss: 0.9242 - regression_loss: 0.8244 - classification_loss: 0.0999 222/500 [============>.................] - ETA: 1:34 - loss: 0.9236 - regression_loss: 0.8238 - classification_loss: 0.0998 223/500 [============>.................] - ETA: 1:33 - loss: 0.9232 - regression_loss: 0.8233 - classification_loss: 0.0999 224/500 [============>.................] - ETA: 1:33 - loss: 0.9231 - regression_loss: 0.8234 - classification_loss: 0.0997 225/500 [============>.................] - ETA: 1:33 - loss: 0.9209 - regression_loss: 0.8215 - classification_loss: 0.0994 226/500 [============>.................] - ETA: 1:32 - loss: 0.9190 - regression_loss: 0.8200 - classification_loss: 0.0990 227/500 [============>.................] - ETA: 1:32 - loss: 0.9188 - regression_loss: 0.8197 - classification_loss: 0.0991 228/500 [============>.................] 
- ETA: 1:32 - loss: 0.9192 - regression_loss: 0.8200 - classification_loss: 0.0992 229/500 [============>.................] - ETA: 1:31 - loss: 0.9198 - regression_loss: 0.8205 - classification_loss: 0.0993 230/500 [============>.................] - ETA: 1:31 - loss: 0.9191 - regression_loss: 0.8200 - classification_loss: 0.0991 231/500 [============>.................] - ETA: 1:31 - loss: 0.9204 - regression_loss: 0.8213 - classification_loss: 0.0991 232/500 [============>.................] - ETA: 1:30 - loss: 0.9191 - regression_loss: 0.8203 - classification_loss: 0.0987 233/500 [============>.................] - ETA: 1:30 - loss: 0.9195 - regression_loss: 0.8206 - classification_loss: 0.0989 234/500 [=============>................] - ETA: 1:30 - loss: 0.9190 - regression_loss: 0.8202 - classification_loss: 0.0988 235/500 [=============>................] - ETA: 1:29 - loss: 0.9194 - regression_loss: 0.8208 - classification_loss: 0.0986 236/500 [=============>................] - ETA: 1:29 - loss: 0.9195 - regression_loss: 0.8211 - classification_loss: 0.0985 237/500 [=============>................] - ETA: 1:29 - loss: 0.9196 - regression_loss: 0.8212 - classification_loss: 0.0984 238/500 [=============>................] - ETA: 1:28 - loss: 0.9181 - regression_loss: 0.8196 - classification_loss: 0.0985 239/500 [=============>................] - ETA: 1:28 - loss: 0.9172 - regression_loss: 0.8188 - classification_loss: 0.0983 240/500 [=============>................] - ETA: 1:28 - loss: 0.9168 - regression_loss: 0.8186 - classification_loss: 0.0982 241/500 [=============>................] - ETA: 1:27 - loss: 0.9170 - regression_loss: 0.8187 - classification_loss: 0.0983 242/500 [=============>................] - ETA: 1:27 - loss: 0.9166 - regression_loss: 0.8185 - classification_loss: 0.0982 243/500 [=============>................] - ETA: 1:27 - loss: 0.9182 - regression_loss: 0.8195 - classification_loss: 0.0987 244/500 [=============>................] 
- ETA: 1:26 - loss: 0.9191 - regression_loss: 0.8202 - classification_loss: 0.0989 245/500 [=============>................] - ETA: 1:26 - loss: 0.9180 - regression_loss: 0.8191 - classification_loss: 0.0990 246/500 [=============>................] - ETA: 1:26 - loss: 0.9180 - regression_loss: 0.8191 - classification_loss: 0.0989 247/500 [=============>................] - ETA: 1:25 - loss: 0.9174 - regression_loss: 0.8186 - classification_loss: 0.0988 248/500 [=============>................] - ETA: 1:25 - loss: 0.9181 - regression_loss: 0.8193 - classification_loss: 0.0988 249/500 [=============>................] - ETA: 1:25 - loss: 0.9162 - regression_loss: 0.8177 - classification_loss: 0.0986 250/500 [==============>...............] - ETA: 1:24 - loss: 0.9177 - regression_loss: 0.8189 - classification_loss: 0.0988 251/500 [==============>...............] - ETA: 1:24 - loss: 0.9160 - regression_loss: 0.8175 - classification_loss: 0.0985 252/500 [==============>...............] - ETA: 1:24 - loss: 0.9157 - regression_loss: 0.8172 - classification_loss: 0.0985 253/500 [==============>...............] - ETA: 1:23 - loss: 0.9137 - regression_loss: 0.8155 - classification_loss: 0.0982 254/500 [==============>...............] - ETA: 1:23 - loss: 0.9145 - regression_loss: 0.8162 - classification_loss: 0.0983 255/500 [==============>...............] - ETA: 1:22 - loss: 0.9142 - regression_loss: 0.8159 - classification_loss: 0.0983 256/500 [==============>...............] - ETA: 1:22 - loss: 0.9157 - regression_loss: 0.8171 - classification_loss: 0.0986 257/500 [==============>...............] - ETA: 1:22 - loss: 0.9150 - regression_loss: 0.8166 - classification_loss: 0.0984 258/500 [==============>...............] - ETA: 1:22 - loss: 0.9139 - regression_loss: 0.8157 - classification_loss: 0.0982 259/500 [==============>...............] - ETA: 1:21 - loss: 0.9121 - regression_loss: 0.8141 - classification_loss: 0.0980 260/500 [==============>...............] 
- ETA: 1:21 - loss: 0.9131 - regression_loss: 0.8151 - classification_loss: 0.0980 261/500 [==============>...............] - ETA: 1:20 - loss: 0.9117 - regression_loss: 0.8140 - classification_loss: 0.0978 262/500 [==============>...............] - ETA: 1:20 - loss: 0.9123 - regression_loss: 0.8143 - classification_loss: 0.0980 263/500 [==============>...............] - ETA: 1:20 - loss: 0.9110 - regression_loss: 0.8132 - classification_loss: 0.0978 264/500 [==============>...............] - ETA: 1:19 - loss: 0.9116 - regression_loss: 0.8136 - classification_loss: 0.0980 265/500 [==============>...............] - ETA: 1:19 - loss: 0.9102 - regression_loss: 0.8125 - classification_loss: 0.0977 266/500 [==============>...............] - ETA: 1:19 - loss: 0.9102 - regression_loss: 0.8125 - classification_loss: 0.0977 267/500 [===============>..............] - ETA: 1:18 - loss: 0.9106 - regression_loss: 0.8128 - classification_loss: 0.0978 268/500 [===============>..............] - ETA: 1:18 - loss: 0.9105 - regression_loss: 0.8130 - classification_loss: 0.0976 269/500 [===============>..............] - ETA: 1:18 - loss: 0.9087 - regression_loss: 0.8115 - classification_loss: 0.0973 270/500 [===============>..............] - ETA: 1:17 - loss: 0.9096 - regression_loss: 0.8125 - classification_loss: 0.0971 271/500 [===============>..............] - ETA: 1:17 - loss: 0.9134 - regression_loss: 0.8158 - classification_loss: 0.0976 272/500 [===============>..............] - ETA: 1:17 - loss: 0.9148 - regression_loss: 0.8173 - classification_loss: 0.0975 273/500 [===============>..............] - ETA: 1:16 - loss: 0.9158 - regression_loss: 0.8180 - classification_loss: 0.0978 274/500 [===============>..............] - ETA: 1:16 - loss: 0.9161 - regression_loss: 0.8183 - classification_loss: 0.0978 275/500 [===============>..............] - ETA: 1:16 - loss: 0.9165 - regression_loss: 0.8188 - classification_loss: 0.0978 276/500 [===============>..............] 
- ETA: 1:15 - loss: 0.9154 - regression_loss: 0.8179 - classification_loss: 0.0976 277/500 [===============>..............] - ETA: 1:15 - loss: 0.9153 - regression_loss: 0.8177 - classification_loss: 0.0976 278/500 [===============>..............] - ETA: 1:15 - loss: 0.9156 - regression_loss: 0.8180 - classification_loss: 0.0976 279/500 [===============>..............] - ETA: 1:14 - loss: 0.9171 - regression_loss: 0.8194 - classification_loss: 0.0977 280/500 [===============>..............] - ETA: 1:14 - loss: 0.9201 - regression_loss: 0.8224 - classification_loss: 0.0977 281/500 [===============>..............] - ETA: 1:14 - loss: 0.9196 - regression_loss: 0.8220 - classification_loss: 0.0976 282/500 [===============>..............] - ETA: 1:13 - loss: 0.9191 - regression_loss: 0.8216 - classification_loss: 0.0975 283/500 [===============>..............] - ETA: 1:13 - loss: 0.9202 - regression_loss: 0.8226 - classification_loss: 0.0976 284/500 [================>.............] - ETA: 1:13 - loss: 0.9187 - regression_loss: 0.8213 - classification_loss: 0.0974 285/500 [================>.............] - ETA: 1:12 - loss: 0.9170 - regression_loss: 0.8199 - classification_loss: 0.0972 286/500 [================>.............] - ETA: 1:12 - loss: 0.9163 - regression_loss: 0.8192 - classification_loss: 0.0970 287/500 [================>.............] - ETA: 1:12 - loss: 0.9158 - regression_loss: 0.8189 - classification_loss: 0.0969 288/500 [================>.............] - ETA: 1:11 - loss: 0.9156 - regression_loss: 0.8185 - classification_loss: 0.0971 289/500 [================>.............] - ETA: 1:11 - loss: 0.9134 - regression_loss: 0.8166 - classification_loss: 0.0968 290/500 [================>.............] - ETA: 1:11 - loss: 0.9128 - regression_loss: 0.8161 - classification_loss: 0.0966 291/500 [================>.............] - ETA: 1:10 - loss: 0.9136 - regression_loss: 0.8168 - classification_loss: 0.0968 292/500 [================>.............] 
- ETA: 1:10 - loss: 0.9130 - regression_loss: 0.8164 - classification_loss: 0.0967 293/500 [================>.............] - ETA: 1:10 - loss: 0.9113 - regression_loss: 0.8150 - classification_loss: 0.0964 294/500 [================>.............] - ETA: 1:09 - loss: 0.9126 - regression_loss: 0.8160 - classification_loss: 0.0966 295/500 [================>.............] - ETA: 1:09 - loss: 0.9118 - regression_loss: 0.8155 - classification_loss: 0.0964 296/500 [================>.............] - ETA: 1:09 - loss: 0.9123 - regression_loss: 0.8159 - classification_loss: 0.0964 297/500 [================>.............] - ETA: 1:08 - loss: 0.9142 - regression_loss: 0.8175 - classification_loss: 0.0967 298/500 [================>.............] - ETA: 1:08 - loss: 0.9146 - regression_loss: 0.8180 - classification_loss: 0.0966 299/500 [================>.............] - ETA: 1:08 - loss: 0.9148 - regression_loss: 0.8181 - classification_loss: 0.0967 300/500 [=================>............] - ETA: 1:07 - loss: 0.9156 - regression_loss: 0.8189 - classification_loss: 0.0967 301/500 [=================>............] - ETA: 1:07 - loss: 0.9155 - regression_loss: 0.8188 - classification_loss: 0.0967 302/500 [=================>............] - ETA: 1:07 - loss: 0.9154 - regression_loss: 0.8187 - classification_loss: 0.0967 303/500 [=================>............] - ETA: 1:06 - loss: 0.9164 - regression_loss: 0.8199 - classification_loss: 0.0965 304/500 [=================>............] - ETA: 1:06 - loss: 0.9173 - regression_loss: 0.8206 - classification_loss: 0.0967 305/500 [=================>............] - ETA: 1:06 - loss: 0.9182 - regression_loss: 0.8213 - classification_loss: 0.0970 306/500 [=================>............] - ETA: 1:05 - loss: 0.9180 - regression_loss: 0.8211 - classification_loss: 0.0968 307/500 [=================>............] - ETA: 1:05 - loss: 0.9170 - regression_loss: 0.8204 - classification_loss: 0.0966 308/500 [=================>............] 
- ETA: 1:05 - loss: 0.9176 - regression_loss: 0.8209 - classification_loss: 0.0966 309/500 [=================>............] - ETA: 1:04 - loss: 0.9174 - regression_loss: 0.8208 - classification_loss: 0.0966 310/500 [=================>............] - ETA: 1:04 - loss: 0.9183 - regression_loss: 0.8217 - classification_loss: 0.0966 311/500 [=================>............] - ETA: 1:04 - loss: 0.9183 - regression_loss: 0.8219 - classification_loss: 0.0964 312/500 [=================>............] - ETA: 1:03 - loss: 0.9193 - regression_loss: 0.8227 - classification_loss: 0.0965 313/500 [=================>............] - ETA: 1:03 - loss: 0.9175 - regression_loss: 0.8212 - classification_loss: 0.0963 314/500 [=================>............] - ETA: 1:03 - loss: 0.9159 - regression_loss: 0.8198 - classification_loss: 0.0961 315/500 [=================>............] - ETA: 1:02 - loss: 0.9143 - regression_loss: 0.8184 - classification_loss: 0.0959 316/500 [=================>............] - ETA: 1:02 - loss: 0.9151 - regression_loss: 0.8192 - classification_loss: 0.0959 317/500 [==================>...........] - ETA: 1:02 - loss: 0.9135 - regression_loss: 0.8179 - classification_loss: 0.0957 318/500 [==================>...........] - ETA: 1:01 - loss: 0.9142 - regression_loss: 0.8185 - classification_loss: 0.0957 319/500 [==================>...........] - ETA: 1:01 - loss: 0.9131 - regression_loss: 0.8177 - classification_loss: 0.0955 320/500 [==================>...........] - ETA: 1:01 - loss: 0.9120 - regression_loss: 0.8167 - classification_loss: 0.0953 321/500 [==================>...........] - ETA: 1:00 - loss: 0.9119 - regression_loss: 0.8166 - classification_loss: 0.0953 322/500 [==================>...........] - ETA: 1:00 - loss: 0.9122 - regression_loss: 0.8169 - classification_loss: 0.0953 323/500 [==================>...........] - ETA: 1:00 - loss: 0.9101 - regression_loss: 0.8151 - classification_loss: 0.0951 324/500 [==================>...........] 
- ETA: 59s - loss: 0.9110 - regression_loss: 0.8159 - classification_loss: 0.0951  325/500 [==================>...........] - ETA: 59s - loss: 0.9123 - regression_loss: 0.8170 - classification_loss: 0.0953 326/500 [==================>...........] - ETA: 59s - loss: 0.9113 - regression_loss: 0.8162 - classification_loss: 0.0951 327/500 [==================>...........] - ETA: 58s - loss: 0.9116 - regression_loss: 0.8164 - classification_loss: 0.0952 328/500 [==================>...........] - ETA: 58s - loss: 0.9134 - regression_loss: 0.8178 - classification_loss: 0.0956 329/500 [==================>...........] - ETA: 58s - loss: 0.9128 - regression_loss: 0.8173 - classification_loss: 0.0955 330/500 [==================>...........] - ETA: 57s - loss: 0.9132 - regression_loss: 0.8176 - classification_loss: 0.0956 331/500 [==================>...........] - ETA: 57s - loss: 0.9145 - regression_loss: 0.8186 - classification_loss: 0.0959 332/500 [==================>...........] - ETA: 57s - loss: 0.9161 - regression_loss: 0.8200 - classification_loss: 0.0961 333/500 [==================>...........] - ETA: 56s - loss: 0.9159 - regression_loss: 0.8198 - classification_loss: 0.0961 334/500 [===================>..........] - ETA: 56s - loss: 0.9156 - regression_loss: 0.8195 - classification_loss: 0.0961 335/500 [===================>..........] - ETA: 56s - loss: 0.9164 - regression_loss: 0.8201 - classification_loss: 0.0963 336/500 [===================>..........] - ETA: 55s - loss: 0.9179 - regression_loss: 0.8214 - classification_loss: 0.0965 337/500 [===================>..........] - ETA: 55s - loss: 0.9180 - regression_loss: 0.8214 - classification_loss: 0.0966 338/500 [===================>..........] - ETA: 55s - loss: 0.9180 - regression_loss: 0.8214 - classification_loss: 0.0966 339/500 [===================>..........] - ETA: 54s - loss: 0.9180 - regression_loss: 0.8214 - classification_loss: 0.0966 340/500 [===================>..........] 
[progress log condensed: epoch 36, steps 341-499 of 500; loss ~0.906-0.915 (regression_loss ~0.81, classification_loss ~0.095-0.097), slowly decreasing]
500/500 [==============================] - 170s 339ms/step - loss: 0.9059 - regression_loss: 0.8111 - classification_loss: 0.0947
1172 instances of class plum with average precision: 0.7917
mAP: 0.7917
Epoch 00036: saving model to ./training/snapshots/resnet101_pascal_36.h5
Epoch 37/150
[progress log condensed: epoch 37, steps 1-13 of 500; loss ~0.86-0.99]
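The per-step lines above all follow the same Keras progress-bar format (step counter, ETA, then the total loss and its regression/classification components). A minimal sketch of extracting those values from a log like this one for plotting or analysis — the regex and helper name are illustrative, not part of keras-retinanet:

```python
import re

# Matches Keras-style progress lines such as:
# "342/500 [===>..] - ETA: 53s - loss: 0.9169 - regression_loss: 0.8200 - classification_loss: 0.0969"
LINE_RE = re.compile(
    r"(\d+)/\d+ \[[=>.]+\]"          # step counter and progress bar
    r".*?loss: ([\d.]+)"             # first "loss:" after the bar is the total loss
    r" - regression_loss: ([\d.]+)"
    r" - classification_loss: ([\d.]+)"
)

def parse_progress(text):
    """Return one dict per progress update found in the log text."""
    records = []
    for m in LINE_RE.finditer(text):
        step, loss, reg, cls = m.groups()
        records.append({
            "step": int(step),
            "loss": float(loss),
            "regression_loss": float(reg),
            "classification_loss": float(cls),
        })
    return records

sample = ("342/500 [===================>..........] - ETA: 53s - loss: 0.9169"
          " - regression_loss: 0.8200 - classification_loss: 0.0969")
records = parse_progress(sample)
```

Note that the total loss in each line is simply the sum of the two components (regression_loss + classification_loss), so either can be recomputed from the other two as a sanity check on the parse.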
[progress log condensed: epoch 37, steps 14-173 of 500; loss ~0.87-1.00 (regression_loss ~0.78-0.89, classification_loss ~0.092-0.110)]
- ETA: 1:51 - loss: 0.9020 - regression_loss: 0.8088 - classification_loss: 0.0932 175/500 [=========>....................] - ETA: 1:50 - loss: 0.9032 - regression_loss: 0.8097 - classification_loss: 0.0935 176/500 [=========>....................] - ETA: 1:50 - loss: 0.9050 - regression_loss: 0.8114 - classification_loss: 0.0936 177/500 [=========>....................] - ETA: 1:50 - loss: 0.9048 - regression_loss: 0.8111 - classification_loss: 0.0936 178/500 [=========>....................] - ETA: 1:49 - loss: 0.9039 - regression_loss: 0.8107 - classification_loss: 0.0932 179/500 [=========>....................] - ETA: 1:49 - loss: 0.9036 - regression_loss: 0.8103 - classification_loss: 0.0933 180/500 [=========>....................] - ETA: 1:49 - loss: 0.9034 - regression_loss: 0.8101 - classification_loss: 0.0933 181/500 [=========>....................] - ETA: 1:48 - loss: 0.9063 - regression_loss: 0.8124 - classification_loss: 0.0939 182/500 [=========>....................] - ETA: 1:48 - loss: 0.9058 - regression_loss: 0.8118 - classification_loss: 0.0940 183/500 [=========>....................] - ETA: 1:48 - loss: 0.9058 - regression_loss: 0.8116 - classification_loss: 0.0942 184/500 [==========>...................] - ETA: 1:47 - loss: 0.9052 - regression_loss: 0.8112 - classification_loss: 0.0940 185/500 [==========>...................] - ETA: 1:47 - loss: 0.9018 - regression_loss: 0.8083 - classification_loss: 0.0936 186/500 [==========>...................] - ETA: 1:47 - loss: 0.9020 - regression_loss: 0.8086 - classification_loss: 0.0934 187/500 [==========>...................] - ETA: 1:46 - loss: 0.9033 - regression_loss: 0.8096 - classification_loss: 0.0937 188/500 [==========>...................] - ETA: 1:46 - loss: 0.9039 - regression_loss: 0.8103 - classification_loss: 0.0936 189/500 [==========>...................] - ETA: 1:45 - loss: 0.9036 - regression_loss: 0.8100 - classification_loss: 0.0935 190/500 [==========>...................] 
- ETA: 1:45 - loss: 0.9002 - regression_loss: 0.8070 - classification_loss: 0.0932 191/500 [==========>...................] - ETA: 1:45 - loss: 0.9007 - regression_loss: 0.8074 - classification_loss: 0.0932 192/500 [==========>...................] - ETA: 1:44 - loss: 0.9035 - regression_loss: 0.8090 - classification_loss: 0.0946 193/500 [==========>...................] - ETA: 1:44 - loss: 0.9037 - regression_loss: 0.8090 - classification_loss: 0.0947 194/500 [==========>...................] - ETA: 1:44 - loss: 0.9042 - regression_loss: 0.8093 - classification_loss: 0.0949 195/500 [==========>...................] - ETA: 1:43 - loss: 0.9040 - regression_loss: 0.8089 - classification_loss: 0.0951 196/500 [==========>...................] - ETA: 1:43 - loss: 0.9044 - regression_loss: 0.8093 - classification_loss: 0.0951 197/500 [==========>...................] - ETA: 1:43 - loss: 0.9017 - regression_loss: 0.8064 - classification_loss: 0.0953 198/500 [==========>...................] - ETA: 1:42 - loss: 0.9029 - regression_loss: 0.8073 - classification_loss: 0.0956 199/500 [==========>...................] - ETA: 1:42 - loss: 0.9027 - regression_loss: 0.8073 - classification_loss: 0.0955 200/500 [===========>..................] - ETA: 1:42 - loss: 0.9014 - regression_loss: 0.8062 - classification_loss: 0.0952 201/500 [===========>..................] - ETA: 1:41 - loss: 0.8989 - regression_loss: 0.8041 - classification_loss: 0.0949 202/500 [===========>..................] - ETA: 1:41 - loss: 0.8996 - regression_loss: 0.8046 - classification_loss: 0.0950 203/500 [===========>..................] - ETA: 1:41 - loss: 0.9011 - regression_loss: 0.8060 - classification_loss: 0.0951 204/500 [===========>..................] - ETA: 1:40 - loss: 0.9013 - regression_loss: 0.8062 - classification_loss: 0.0950 205/500 [===========>..................] - ETA: 1:40 - loss: 0.9018 - regression_loss: 0.8068 - classification_loss: 0.0949 206/500 [===========>..................] 
- ETA: 1:40 - loss: 0.9028 - regression_loss: 0.8076 - classification_loss: 0.0951 207/500 [===========>..................] - ETA: 1:39 - loss: 0.9037 - regression_loss: 0.8084 - classification_loss: 0.0953 208/500 [===========>..................] - ETA: 1:39 - loss: 0.9025 - regression_loss: 0.8074 - classification_loss: 0.0951 209/500 [===========>..................] - ETA: 1:39 - loss: 0.9011 - regression_loss: 0.8061 - classification_loss: 0.0949 210/500 [===========>..................] - ETA: 1:38 - loss: 0.8975 - regression_loss: 0.8030 - classification_loss: 0.0945 211/500 [===========>..................] - ETA: 1:38 - loss: 0.8980 - regression_loss: 0.8035 - classification_loss: 0.0945 212/500 [===========>..................] - ETA: 1:38 - loss: 0.9008 - regression_loss: 0.8056 - classification_loss: 0.0952 213/500 [===========>..................] - ETA: 1:37 - loss: 0.9008 - regression_loss: 0.8058 - classification_loss: 0.0949 214/500 [===========>..................] - ETA: 1:37 - loss: 0.9025 - regression_loss: 0.8071 - classification_loss: 0.0954 215/500 [===========>..................] - ETA: 1:37 - loss: 0.9025 - regression_loss: 0.8072 - classification_loss: 0.0953 216/500 [===========>..................] - ETA: 1:36 - loss: 0.9033 - regression_loss: 0.8079 - classification_loss: 0.0954 217/500 [============>.................] - ETA: 1:36 - loss: 0.9031 - regression_loss: 0.8077 - classification_loss: 0.0954 218/500 [============>.................] - ETA: 1:36 - loss: 0.9038 - regression_loss: 0.8084 - classification_loss: 0.0954 219/500 [============>.................] - ETA: 1:35 - loss: 0.9042 - regression_loss: 0.8088 - classification_loss: 0.0955 220/500 [============>.................] - ETA: 1:35 - loss: 0.9045 - regression_loss: 0.8089 - classification_loss: 0.0955 221/500 [============>.................] - ETA: 1:35 - loss: 0.9044 - regression_loss: 0.8089 - classification_loss: 0.0955 222/500 [============>.................] 
- ETA: 1:34 - loss: 0.9085 - regression_loss: 0.8123 - classification_loss: 0.0962 223/500 [============>.................] - ETA: 1:34 - loss: 0.9089 - regression_loss: 0.8127 - classification_loss: 0.0961 224/500 [============>.................] - ETA: 1:34 - loss: 0.9110 - regression_loss: 0.8147 - classification_loss: 0.0963 225/500 [============>.................] - ETA: 1:33 - loss: 0.9105 - regression_loss: 0.8145 - classification_loss: 0.0960 226/500 [============>.................] - ETA: 1:33 - loss: 0.9117 - regression_loss: 0.8151 - classification_loss: 0.0966 227/500 [============>.................] - ETA: 1:32 - loss: 0.9114 - regression_loss: 0.8149 - classification_loss: 0.0964 228/500 [============>.................] - ETA: 1:32 - loss: 0.9105 - regression_loss: 0.8142 - classification_loss: 0.0963 229/500 [============>.................] - ETA: 1:32 - loss: 0.9090 - regression_loss: 0.8130 - classification_loss: 0.0960 230/500 [============>.................] - ETA: 1:31 - loss: 0.9099 - regression_loss: 0.8138 - classification_loss: 0.0961 231/500 [============>.................] - ETA: 1:31 - loss: 0.9108 - regression_loss: 0.8145 - classification_loss: 0.0964 232/500 [============>.................] - ETA: 1:31 - loss: 0.9117 - regression_loss: 0.8154 - classification_loss: 0.0962 233/500 [============>.................] - ETA: 1:30 - loss: 0.9118 - regression_loss: 0.8156 - classification_loss: 0.0961 234/500 [=============>................] - ETA: 1:30 - loss: 0.9129 - regression_loss: 0.8166 - classification_loss: 0.0963 235/500 [=============>................] - ETA: 1:30 - loss: 0.9155 - regression_loss: 0.8187 - classification_loss: 0.0969 236/500 [=============>................] - ETA: 1:29 - loss: 0.9160 - regression_loss: 0.8191 - classification_loss: 0.0969 237/500 [=============>................] - ETA: 1:29 - loss: 0.9135 - regression_loss: 0.8169 - classification_loss: 0.0966 238/500 [=============>................] 
- ETA: 1:29 - loss: 0.9123 - regression_loss: 0.8159 - classification_loss: 0.0964 239/500 [=============>................] - ETA: 1:28 - loss: 0.9132 - regression_loss: 0.8167 - classification_loss: 0.0965 240/500 [=============>................] - ETA: 1:28 - loss: 0.9145 - regression_loss: 0.8179 - classification_loss: 0.0967 241/500 [=============>................] - ETA: 1:28 - loss: 0.9154 - regression_loss: 0.8186 - classification_loss: 0.0968 242/500 [=============>................] - ETA: 1:27 - loss: 0.9144 - regression_loss: 0.8180 - classification_loss: 0.0965 243/500 [=============>................] - ETA: 1:27 - loss: 0.9158 - regression_loss: 0.8191 - classification_loss: 0.0967 244/500 [=============>................] - ETA: 1:27 - loss: 0.9154 - regression_loss: 0.8186 - classification_loss: 0.0968 245/500 [=============>................] - ETA: 1:26 - loss: 0.9148 - regression_loss: 0.8182 - classification_loss: 0.0966 246/500 [=============>................] - ETA: 1:26 - loss: 0.9156 - regression_loss: 0.8187 - classification_loss: 0.0969 247/500 [=============>................] - ETA: 1:26 - loss: 0.9164 - regression_loss: 0.8194 - classification_loss: 0.0970 248/500 [=============>................] - ETA: 1:25 - loss: 0.9182 - regression_loss: 0.8210 - classification_loss: 0.0972 249/500 [=============>................] - ETA: 1:25 - loss: 0.9194 - regression_loss: 0.8223 - classification_loss: 0.0971 250/500 [==============>...............] - ETA: 1:25 - loss: 0.9177 - regression_loss: 0.8209 - classification_loss: 0.0968 251/500 [==============>...............] - ETA: 1:24 - loss: 0.9200 - regression_loss: 0.8222 - classification_loss: 0.0979 252/500 [==============>...............] - ETA: 1:24 - loss: 0.9174 - regression_loss: 0.8198 - classification_loss: 0.0975 253/500 [==============>...............] - ETA: 1:24 - loss: 0.9180 - regression_loss: 0.8204 - classification_loss: 0.0976 254/500 [==============>...............] 
- ETA: 1:23 - loss: 0.9178 - regression_loss: 0.8203 - classification_loss: 0.0976 255/500 [==============>...............] - ETA: 1:23 - loss: 0.9153 - regression_loss: 0.8179 - classification_loss: 0.0973 256/500 [==============>...............] - ETA: 1:23 - loss: 0.9158 - regression_loss: 0.8184 - classification_loss: 0.0974 257/500 [==============>...............] - ETA: 1:22 - loss: 0.9177 - regression_loss: 0.8201 - classification_loss: 0.0976 258/500 [==============>...............] - ETA: 1:22 - loss: 0.9167 - regression_loss: 0.8193 - classification_loss: 0.0974 259/500 [==============>...............] - ETA: 1:22 - loss: 0.9166 - regression_loss: 0.8193 - classification_loss: 0.0973 260/500 [==============>...............] - ETA: 1:21 - loss: 0.9165 - regression_loss: 0.8192 - classification_loss: 0.0972 261/500 [==============>...............] - ETA: 1:21 - loss: 0.9161 - regression_loss: 0.8190 - classification_loss: 0.0970 262/500 [==============>...............] - ETA: 1:20 - loss: 0.9158 - regression_loss: 0.8188 - classification_loss: 0.0969 263/500 [==============>...............] - ETA: 1:20 - loss: 0.9141 - regression_loss: 0.8174 - classification_loss: 0.0967 264/500 [==============>...............] - ETA: 1:20 - loss: 0.9148 - regression_loss: 0.8179 - classification_loss: 0.0969 265/500 [==============>...............] - ETA: 1:19 - loss: 0.9150 - regression_loss: 0.8182 - classification_loss: 0.0968 266/500 [==============>...............] - ETA: 1:19 - loss: 0.9151 - regression_loss: 0.8183 - classification_loss: 0.0968 267/500 [===============>..............] - ETA: 1:19 - loss: 0.9165 - regression_loss: 0.8192 - classification_loss: 0.0973 268/500 [===============>..............] - ETA: 1:18 - loss: 0.9167 - regression_loss: 0.8193 - classification_loss: 0.0974 269/500 [===============>..............] - ETA: 1:18 - loss: 0.9182 - regression_loss: 0.8206 - classification_loss: 0.0976 270/500 [===============>..............] 
- ETA: 1:18 - loss: 0.9194 - regression_loss: 0.8216 - classification_loss: 0.0977 271/500 [===============>..............] - ETA: 1:17 - loss: 0.9201 - regression_loss: 0.8225 - classification_loss: 0.0977 272/500 [===============>..............] - ETA: 1:17 - loss: 0.9199 - regression_loss: 0.8223 - classification_loss: 0.0976 273/500 [===============>..............] - ETA: 1:17 - loss: 0.9181 - regression_loss: 0.8208 - classification_loss: 0.0973 274/500 [===============>..............] - ETA: 1:16 - loss: 0.9176 - regression_loss: 0.8204 - classification_loss: 0.0973 275/500 [===============>..............] - ETA: 1:16 - loss: 0.9176 - regression_loss: 0.8202 - classification_loss: 0.0975 276/500 [===============>..............] - ETA: 1:16 - loss: 0.9187 - regression_loss: 0.8214 - classification_loss: 0.0973 277/500 [===============>..............] - ETA: 1:15 - loss: 0.9179 - regression_loss: 0.8208 - classification_loss: 0.0971 278/500 [===============>..............] - ETA: 1:15 - loss: 0.9193 - regression_loss: 0.8219 - classification_loss: 0.0974 279/500 [===============>..............] - ETA: 1:15 - loss: 0.9185 - regression_loss: 0.8212 - classification_loss: 0.0972 280/500 [===============>..............] - ETA: 1:14 - loss: 0.9192 - regression_loss: 0.8219 - classification_loss: 0.0973 281/500 [===============>..............] - ETA: 1:14 - loss: 0.9192 - regression_loss: 0.8220 - classification_loss: 0.0972 282/500 [===============>..............] - ETA: 1:14 - loss: 0.9203 - regression_loss: 0.8228 - classification_loss: 0.0974 283/500 [===============>..............] - ETA: 1:13 - loss: 0.9212 - regression_loss: 0.8236 - classification_loss: 0.0976 284/500 [================>.............] - ETA: 1:13 - loss: 0.9211 - regression_loss: 0.8233 - classification_loss: 0.0977 285/500 [================>.............] - ETA: 1:13 - loss: 0.9215 - regression_loss: 0.8238 - classification_loss: 0.0978 286/500 [================>.............] 
- ETA: 1:12 - loss: 0.9196 - regression_loss: 0.8221 - classification_loss: 0.0975 287/500 [================>.............] - ETA: 1:12 - loss: 0.9203 - regression_loss: 0.8229 - classification_loss: 0.0974 288/500 [================>.............] - ETA: 1:12 - loss: 0.9201 - regression_loss: 0.8226 - classification_loss: 0.0975 289/500 [================>.............] - ETA: 1:11 - loss: 0.9198 - regression_loss: 0.8224 - classification_loss: 0.0973 290/500 [================>.............] - ETA: 1:11 - loss: 0.9208 - regression_loss: 0.8233 - classification_loss: 0.0976 291/500 [================>.............] - ETA: 1:11 - loss: 0.9226 - regression_loss: 0.8246 - classification_loss: 0.0980 292/500 [================>.............] - ETA: 1:10 - loss: 0.9222 - regression_loss: 0.8243 - classification_loss: 0.0978 293/500 [================>.............] - ETA: 1:10 - loss: 0.9232 - regression_loss: 0.8252 - classification_loss: 0.0980 294/500 [================>.............] - ETA: 1:10 - loss: 0.9221 - regression_loss: 0.8243 - classification_loss: 0.0978 295/500 [================>.............] - ETA: 1:09 - loss: 0.9234 - regression_loss: 0.8252 - classification_loss: 0.0981 296/500 [================>.............] - ETA: 1:09 - loss: 0.9220 - regression_loss: 0.8239 - classification_loss: 0.0981 297/500 [================>.............] - ETA: 1:09 - loss: 0.9216 - regression_loss: 0.8236 - classification_loss: 0.0980 298/500 [================>.............] - ETA: 1:08 - loss: 0.9197 - regression_loss: 0.8219 - classification_loss: 0.0977 299/500 [================>.............] - ETA: 1:08 - loss: 0.9202 - regression_loss: 0.8227 - classification_loss: 0.0975 300/500 [=================>............] - ETA: 1:08 - loss: 0.9220 - regression_loss: 0.8242 - classification_loss: 0.0977 301/500 [=================>............] - ETA: 1:07 - loss: 0.9211 - regression_loss: 0.8234 - classification_loss: 0.0977 302/500 [=================>............] 
- ETA: 1:07 - loss: 0.9200 - regression_loss: 0.8225 - classification_loss: 0.0975 303/500 [=================>............] - ETA: 1:07 - loss: 0.9187 - regression_loss: 0.8215 - classification_loss: 0.0972 304/500 [=================>............] - ETA: 1:06 - loss: 0.9174 - regression_loss: 0.8204 - classification_loss: 0.0970 305/500 [=================>............] - ETA: 1:06 - loss: 0.9188 - regression_loss: 0.8211 - classification_loss: 0.0976 306/500 [=================>............] - ETA: 1:06 - loss: 0.9189 - regression_loss: 0.8213 - classification_loss: 0.0976 307/500 [=================>............] - ETA: 1:05 - loss: 0.9168 - regression_loss: 0.8194 - classification_loss: 0.0974 308/500 [=================>............] - ETA: 1:05 - loss: 0.9162 - regression_loss: 0.8189 - classification_loss: 0.0973 309/500 [=================>............] - ETA: 1:04 - loss: 0.9176 - regression_loss: 0.8201 - classification_loss: 0.0975 310/500 [=================>............] - ETA: 1:04 - loss: 0.9185 - regression_loss: 0.8211 - classification_loss: 0.0975 311/500 [=================>............] - ETA: 1:04 - loss: 0.9181 - regression_loss: 0.8208 - classification_loss: 0.0973 312/500 [=================>............] - ETA: 1:03 - loss: 0.9180 - regression_loss: 0.8207 - classification_loss: 0.0973 313/500 [=================>............] - ETA: 1:03 - loss: 0.9191 - regression_loss: 0.8216 - classification_loss: 0.0975 314/500 [=================>............] - ETA: 1:03 - loss: 0.9194 - regression_loss: 0.8219 - classification_loss: 0.0975 315/500 [=================>............] - ETA: 1:02 - loss: 0.9176 - regression_loss: 0.8204 - classification_loss: 0.0972 316/500 [=================>............] - ETA: 1:02 - loss: 0.9174 - regression_loss: 0.8204 - classification_loss: 0.0971 317/500 [==================>...........] - ETA: 1:02 - loss: 0.9176 - regression_loss: 0.8205 - classification_loss: 0.0972 318/500 [==================>...........] 
- ETA: 1:01 - loss: 0.9173 - regression_loss: 0.8202 - classification_loss: 0.0972 319/500 [==================>...........] - ETA: 1:01 - loss: 0.9172 - regression_loss: 0.8201 - classification_loss: 0.0971 320/500 [==================>...........] - ETA: 1:01 - loss: 0.9181 - regression_loss: 0.8209 - classification_loss: 0.0972 321/500 [==================>...........] - ETA: 1:00 - loss: 0.9187 - regression_loss: 0.8215 - classification_loss: 0.0972 322/500 [==================>...........] - ETA: 1:00 - loss: 0.9192 - regression_loss: 0.8219 - classification_loss: 0.0972 323/500 [==================>...........] - ETA: 1:00 - loss: 0.9188 - regression_loss: 0.8215 - classification_loss: 0.0973 324/500 [==================>...........] - ETA: 59s - loss: 0.9198 - regression_loss: 0.8225 - classification_loss: 0.0973  325/500 [==================>...........] - ETA: 59s - loss: 0.9197 - regression_loss: 0.8226 - classification_loss: 0.0971 326/500 [==================>...........] - ETA: 59s - loss: 0.9194 - regression_loss: 0.8224 - classification_loss: 0.0971 327/500 [==================>...........] - ETA: 58s - loss: 0.9187 - regression_loss: 0.8218 - classification_loss: 0.0970 328/500 [==================>...........] - ETA: 58s - loss: 0.9187 - regression_loss: 0.8218 - classification_loss: 0.0969 329/500 [==================>...........] - ETA: 58s - loss: 0.9182 - regression_loss: 0.8213 - classification_loss: 0.0969 330/500 [==================>...........] - ETA: 57s - loss: 0.9186 - regression_loss: 0.8218 - classification_loss: 0.0968 331/500 [==================>...........] - ETA: 57s - loss: 0.9191 - regression_loss: 0.8222 - classification_loss: 0.0968 332/500 [==================>...........] - ETA: 57s - loss: 0.9197 - regression_loss: 0.8228 - classification_loss: 0.0969 333/500 [==================>...........] - ETA: 56s - loss: 0.9203 - regression_loss: 0.8233 - classification_loss: 0.0970 334/500 [===================>..........] 
- ETA: 56s - loss: 0.9194 - regression_loss: 0.8226 - classification_loss: 0.0968 335/500 [===================>..........] - ETA: 56s - loss: 0.9208 - regression_loss: 0.8239 - classification_loss: 0.0969 336/500 [===================>..........] - ETA: 55s - loss: 0.9219 - regression_loss: 0.8248 - classification_loss: 0.0971 337/500 [===================>..........] - ETA: 55s - loss: 0.9227 - regression_loss: 0.8256 - classification_loss: 0.0972 338/500 [===================>..........] - ETA: 55s - loss: 0.9216 - regression_loss: 0.8247 - classification_loss: 0.0969 339/500 [===================>..........] - ETA: 54s - loss: 0.9223 - regression_loss: 0.8253 - classification_loss: 0.0970 340/500 [===================>..........] - ETA: 54s - loss: 0.9213 - regression_loss: 0.8244 - classification_loss: 0.0969 341/500 [===================>..........] - ETA: 54s - loss: 0.9211 - regression_loss: 0.8242 - classification_loss: 0.0968 342/500 [===================>..........] - ETA: 53s - loss: 0.9217 - regression_loss: 0.8249 - classification_loss: 0.0968 343/500 [===================>..........] - ETA: 53s - loss: 0.9210 - regression_loss: 0.8243 - classification_loss: 0.0967 344/500 [===================>..........] - ETA: 53s - loss: 0.9205 - regression_loss: 0.8239 - classification_loss: 0.0966 345/500 [===================>..........] - ETA: 52s - loss: 0.9195 - regression_loss: 0.8231 - classification_loss: 0.0964 346/500 [===================>..........] - ETA: 52s - loss: 0.9178 - regression_loss: 0.8216 - classification_loss: 0.0963 347/500 [===================>..........] - ETA: 52s - loss: 0.9186 - regression_loss: 0.8222 - classification_loss: 0.0964 348/500 [===================>..........] - ETA: 51s - loss: 0.9184 - regression_loss: 0.8221 - classification_loss: 0.0963 349/500 [===================>..........] - ETA: 51s - loss: 0.9200 - regression_loss: 0.8235 - classification_loss: 0.0966 350/500 [====================>.........] 
- ETA: 51s - loss: 0.9209 - regression_loss: 0.8243 - classification_loss: 0.0966 351/500 [====================>.........] - ETA: 50s - loss: 0.9201 - regression_loss: 0.8237 - classification_loss: 0.0964 352/500 [====================>.........] - ETA: 50s - loss: 0.9183 - regression_loss: 0.8221 - classification_loss: 0.0962 353/500 [====================>.........] - ETA: 50s - loss: 0.9186 - regression_loss: 0.8223 - classification_loss: 0.0963 354/500 [====================>.........] - ETA: 49s - loss: 0.9182 - regression_loss: 0.8219 - classification_loss: 0.0963 355/500 [====================>.........] - ETA: 49s - loss: 0.9192 - regression_loss: 0.8226 - classification_loss: 0.0965 356/500 [====================>.........] - ETA: 49s - loss: 0.9188 - regression_loss: 0.8224 - classification_loss: 0.0964 357/500 [====================>.........] - ETA: 48s - loss: 0.9192 - regression_loss: 0.8227 - classification_loss: 0.0966 358/500 [====================>.........] - ETA: 48s - loss: 0.9196 - regression_loss: 0.8231 - classification_loss: 0.0965 359/500 [====================>.........] - ETA: 47s - loss: 0.9201 - regression_loss: 0.8235 - classification_loss: 0.0965 360/500 [====================>.........] - ETA: 47s - loss: 0.9190 - regression_loss: 0.8227 - classification_loss: 0.0963 361/500 [====================>.........] - ETA: 47s - loss: 0.9180 - regression_loss: 0.8218 - classification_loss: 0.0962 362/500 [====================>.........] - ETA: 46s - loss: 0.9184 - regression_loss: 0.8221 - classification_loss: 0.0963 363/500 [====================>.........] - ETA: 46s - loss: 0.9168 - regression_loss: 0.8207 - classification_loss: 0.0961 364/500 [====================>.........] - ETA: 46s - loss: 0.9157 - regression_loss: 0.8198 - classification_loss: 0.0959 365/500 [====================>.........] - ETA: 45s - loss: 0.9159 - regression_loss: 0.8199 - classification_loss: 0.0960 366/500 [====================>.........] 
- ETA: 45s - loss: 0.9168 - regression_loss: 0.8208 - classification_loss: 0.0960 367/500 [=====================>........] - ETA: 45s - loss: 0.9165 - regression_loss: 0.8205 - classification_loss: 0.0960 368/500 [=====================>........] - ETA: 44s - loss: 0.9158 - regression_loss: 0.8199 - classification_loss: 0.0959 369/500 [=====================>........] - ETA: 44s - loss: 0.9152 - regression_loss: 0.8194 - classification_loss: 0.0958 370/500 [=====================>........] - ETA: 44s - loss: 0.9138 - regression_loss: 0.8182 - classification_loss: 0.0956 371/500 [=====================>........] - ETA: 43s - loss: 0.9124 - regression_loss: 0.8169 - classification_loss: 0.0954 372/500 [=====================>........] - ETA: 43s - loss: 0.9127 - regression_loss: 0.8172 - classification_loss: 0.0955 373/500 [=====================>........] - ETA: 43s - loss: 0.9120 - regression_loss: 0.8166 - classification_loss: 0.0954 374/500 [=====================>........] - ETA: 42s - loss: 0.9114 - regression_loss: 0.8161 - classification_loss: 0.0953 375/500 [=====================>........] - ETA: 42s - loss: 0.9111 - regression_loss: 0.8158 - classification_loss: 0.0952 376/500 [=====================>........] - ETA: 42s - loss: 0.9104 - regression_loss: 0.8152 - classification_loss: 0.0952 377/500 [=====================>........] - ETA: 41s - loss: 0.9096 - regression_loss: 0.8145 - classification_loss: 0.0950 378/500 [=====================>........] - ETA: 41s - loss: 0.9095 - regression_loss: 0.8144 - classification_loss: 0.0951 379/500 [=====================>........] - ETA: 41s - loss: 0.9097 - regression_loss: 0.8145 - classification_loss: 0.0952 380/500 [=====================>........] - ETA: 40s - loss: 0.9095 - regression_loss: 0.8144 - classification_loss: 0.0951 381/500 [=====================>........] - ETA: 40s - loss: 0.9101 - regression_loss: 0.8150 - classification_loss: 0.0951 382/500 [=====================>........] 
[per-batch progress output for epoch 37, batches 383–500, elided]
500/500 [==============================] - 170s 340ms/step - loss: 0.9088 - regression_loss: 0.8133 - classification_loss: 0.0955
1172 instances of class plum with average precision: 0.7996
mAP: 0.7996
Epoch 00037: saving model to ./training/snapshots/resnet101_pascal_37.h5
Epoch 38/150
[per-batch progress output for epoch 38 elided]
- ETA: 1:36 - loss: 0.9015 - regression_loss: 0.8132 - classification_loss: 0.0883 218/500 [============>.................] - ETA: 1:36 - loss: 0.9025 - regression_loss: 0.8140 - classification_loss: 0.0885 219/500 [============>.................] - ETA: 1:35 - loss: 0.9032 - regression_loss: 0.8145 - classification_loss: 0.0887 220/500 [============>.................] - ETA: 1:35 - loss: 0.9017 - regression_loss: 0.8132 - classification_loss: 0.0885 221/500 [============>.................] - ETA: 1:35 - loss: 0.9034 - regression_loss: 0.8146 - classification_loss: 0.0888 222/500 [============>.................] - ETA: 1:34 - loss: 0.9033 - regression_loss: 0.8146 - classification_loss: 0.0887 223/500 [============>.................] - ETA: 1:34 - loss: 0.9045 - regression_loss: 0.8156 - classification_loss: 0.0889 224/500 [============>.................] - ETA: 1:33 - loss: 0.9029 - regression_loss: 0.8143 - classification_loss: 0.0886 225/500 [============>.................] - ETA: 1:33 - loss: 0.9037 - regression_loss: 0.8150 - classification_loss: 0.0887 226/500 [============>.................] - ETA: 1:33 - loss: 0.9011 - regression_loss: 0.8128 - classification_loss: 0.0884 227/500 [============>.................] - ETA: 1:32 - loss: 0.9003 - regression_loss: 0.8121 - classification_loss: 0.0882 228/500 [============>.................] - ETA: 1:32 - loss: 0.9018 - regression_loss: 0.8133 - classification_loss: 0.0885 229/500 [============>.................] - ETA: 1:32 - loss: 0.9018 - regression_loss: 0.8134 - classification_loss: 0.0885 230/500 [============>.................] - ETA: 1:31 - loss: 0.9019 - regression_loss: 0.8135 - classification_loss: 0.0884 231/500 [============>.................] - ETA: 1:31 - loss: 0.9026 - regression_loss: 0.8141 - classification_loss: 0.0885 232/500 [============>.................] - ETA: 1:31 - loss: 0.9022 - regression_loss: 0.8137 - classification_loss: 0.0885 233/500 [============>.................] 
- ETA: 1:30 - loss: 0.9027 - regression_loss: 0.8143 - classification_loss: 0.0884 234/500 [=============>................] - ETA: 1:30 - loss: 0.9054 - regression_loss: 0.8172 - classification_loss: 0.0883 235/500 [=============>................] - ETA: 1:29 - loss: 0.9064 - regression_loss: 0.8180 - classification_loss: 0.0884 236/500 [=============>................] - ETA: 1:29 - loss: 0.9063 - regression_loss: 0.8180 - classification_loss: 0.0882 237/500 [=============>................] - ETA: 1:29 - loss: 0.9073 - regression_loss: 0.8189 - classification_loss: 0.0884 238/500 [=============>................] - ETA: 1:28 - loss: 0.9065 - regression_loss: 0.8182 - classification_loss: 0.0883 239/500 [=============>................] - ETA: 1:28 - loss: 0.9071 - regression_loss: 0.8185 - classification_loss: 0.0886 240/500 [=============>................] - ETA: 1:28 - loss: 0.9065 - regression_loss: 0.8181 - classification_loss: 0.0884 241/500 [=============>................] - ETA: 1:27 - loss: 0.9066 - regression_loss: 0.8182 - classification_loss: 0.0884 242/500 [=============>................] - ETA: 1:27 - loss: 0.9072 - regression_loss: 0.8186 - classification_loss: 0.0886 243/500 [=============>................] - ETA: 1:27 - loss: 0.9047 - regression_loss: 0.8164 - classification_loss: 0.0883 244/500 [=============>................] - ETA: 1:26 - loss: 0.9020 - regression_loss: 0.8141 - classification_loss: 0.0880 245/500 [=============>................] - ETA: 1:26 - loss: 0.9019 - regression_loss: 0.8140 - classification_loss: 0.0879 246/500 [=============>................] - ETA: 1:26 - loss: 0.9014 - regression_loss: 0.8136 - classification_loss: 0.0879 247/500 [=============>................] - ETA: 1:25 - loss: 0.9004 - regression_loss: 0.8125 - classification_loss: 0.0878 248/500 [=============>................] - ETA: 1:25 - loss: 0.9005 - regression_loss: 0.8128 - classification_loss: 0.0876 249/500 [=============>................] 
- ETA: 1:25 - loss: 0.9011 - regression_loss: 0.8134 - classification_loss: 0.0877 250/500 [==============>...............] - ETA: 1:24 - loss: 0.9009 - regression_loss: 0.8130 - classification_loss: 0.0879 251/500 [==============>...............] - ETA: 1:24 - loss: 0.9000 - regression_loss: 0.8124 - classification_loss: 0.0877 252/500 [==============>...............] - ETA: 1:24 - loss: 0.9022 - regression_loss: 0.8142 - classification_loss: 0.0880 253/500 [==============>...............] - ETA: 1:23 - loss: 0.9079 - regression_loss: 0.8190 - classification_loss: 0.0889 254/500 [==============>...............] - ETA: 1:23 - loss: 0.9087 - regression_loss: 0.8197 - classification_loss: 0.0890 255/500 [==============>...............] - ETA: 1:23 - loss: 0.9089 - regression_loss: 0.8199 - classification_loss: 0.0890 256/500 [==============>...............] - ETA: 1:22 - loss: 0.9100 - regression_loss: 0.8211 - classification_loss: 0.0889 257/500 [==============>...............] - ETA: 1:22 - loss: 0.9104 - regression_loss: 0.8213 - classification_loss: 0.0890 258/500 [==============>...............] - ETA: 1:22 - loss: 0.9107 - regression_loss: 0.8218 - classification_loss: 0.0890 259/500 [==============>...............] - ETA: 1:21 - loss: 0.9106 - regression_loss: 0.8216 - classification_loss: 0.0890 260/500 [==============>...............] - ETA: 1:21 - loss: 0.9132 - regression_loss: 0.8238 - classification_loss: 0.0893 261/500 [==============>...............] - ETA: 1:21 - loss: 0.9153 - regression_loss: 0.8256 - classification_loss: 0.0897 262/500 [==============>...............] - ETA: 1:20 - loss: 0.9156 - regression_loss: 0.8258 - classification_loss: 0.0898 263/500 [==============>...............] - ETA: 1:20 - loss: 0.9168 - regression_loss: 0.8260 - classification_loss: 0.0908 264/500 [==============>...............] - ETA: 1:20 - loss: 0.9170 - regression_loss: 0.8262 - classification_loss: 0.0908 265/500 [==============>...............] 
- ETA: 1:19 - loss: 0.9163 - regression_loss: 0.8256 - classification_loss: 0.0907 266/500 [==============>...............] - ETA: 1:19 - loss: 0.9168 - regression_loss: 0.8260 - classification_loss: 0.0908 267/500 [===============>..............] - ETA: 1:19 - loss: 0.9148 - regression_loss: 0.8242 - classification_loss: 0.0905 268/500 [===============>..............] - ETA: 1:18 - loss: 0.9132 - regression_loss: 0.8227 - classification_loss: 0.0904 269/500 [===============>..............] - ETA: 1:18 - loss: 0.9119 - regression_loss: 0.8217 - classification_loss: 0.0902 270/500 [===============>..............] - ETA: 1:18 - loss: 0.9119 - regression_loss: 0.8218 - classification_loss: 0.0901 271/500 [===============>..............] - ETA: 1:17 - loss: 0.9111 - regression_loss: 0.8211 - classification_loss: 0.0900 272/500 [===============>..............] - ETA: 1:17 - loss: 0.9135 - regression_loss: 0.8229 - classification_loss: 0.0906 273/500 [===============>..............] - ETA: 1:17 - loss: 0.9131 - regression_loss: 0.8227 - classification_loss: 0.0905 274/500 [===============>..............] - ETA: 1:16 - loss: 0.9137 - regression_loss: 0.8232 - classification_loss: 0.0905 275/500 [===============>..............] - ETA: 1:16 - loss: 0.9118 - regression_loss: 0.8216 - classification_loss: 0.0902 276/500 [===============>..............] - ETA: 1:16 - loss: 0.9114 - regression_loss: 0.8212 - classification_loss: 0.0902 277/500 [===============>..............] - ETA: 1:15 - loss: 0.9127 - regression_loss: 0.8224 - classification_loss: 0.0903 278/500 [===============>..............] - ETA: 1:15 - loss: 0.9143 - regression_loss: 0.8239 - classification_loss: 0.0903 279/500 [===============>..............] - ETA: 1:15 - loss: 0.9134 - regression_loss: 0.8232 - classification_loss: 0.0902 280/500 [===============>..............] - ETA: 1:14 - loss: 0.9142 - regression_loss: 0.8237 - classification_loss: 0.0905 281/500 [===============>..............] 
- ETA: 1:14 - loss: 0.9129 - regression_loss: 0.8225 - classification_loss: 0.0904 282/500 [===============>..............] - ETA: 1:14 - loss: 0.9119 - regression_loss: 0.8217 - classification_loss: 0.0902 283/500 [===============>..............] - ETA: 1:13 - loss: 0.9118 - regression_loss: 0.8215 - classification_loss: 0.0903 284/500 [================>.............] - ETA: 1:13 - loss: 0.9098 - regression_loss: 0.8198 - classification_loss: 0.0900 285/500 [================>.............] - ETA: 1:12 - loss: 0.9098 - regression_loss: 0.8197 - classification_loss: 0.0901 286/500 [================>.............] - ETA: 1:12 - loss: 0.9083 - regression_loss: 0.8184 - classification_loss: 0.0899 287/500 [================>.............] - ETA: 1:12 - loss: 0.9062 - regression_loss: 0.8165 - classification_loss: 0.0897 288/500 [================>.............] - ETA: 1:11 - loss: 0.9072 - regression_loss: 0.8173 - classification_loss: 0.0899 289/500 [================>.............] - ETA: 1:11 - loss: 0.9059 - regression_loss: 0.8163 - classification_loss: 0.0896 290/500 [================>.............] - ETA: 1:11 - loss: 0.9054 - regression_loss: 0.8158 - classification_loss: 0.0895 291/500 [================>.............] - ETA: 1:10 - loss: 0.9064 - regression_loss: 0.8168 - classification_loss: 0.0896 292/500 [================>.............] - ETA: 1:10 - loss: 0.9062 - regression_loss: 0.8166 - classification_loss: 0.0896 293/500 [================>.............] - ETA: 1:10 - loss: 0.9080 - regression_loss: 0.8180 - classification_loss: 0.0901 294/500 [================>.............] - ETA: 1:09 - loss: 0.9076 - regression_loss: 0.8178 - classification_loss: 0.0899 295/500 [================>.............] - ETA: 1:09 - loss: 0.9077 - regression_loss: 0.8177 - classification_loss: 0.0899 296/500 [================>.............] - ETA: 1:09 - loss: 0.9070 - regression_loss: 0.8170 - classification_loss: 0.0899 297/500 [================>.............] 
- ETA: 1:08 - loss: 0.9065 - regression_loss: 0.8167 - classification_loss: 0.0898 298/500 [================>.............] - ETA: 1:08 - loss: 0.9052 - regression_loss: 0.8155 - classification_loss: 0.0898 299/500 [================>.............] - ETA: 1:08 - loss: 0.9036 - regression_loss: 0.8141 - classification_loss: 0.0895 300/500 [=================>............] - ETA: 1:07 - loss: 0.9043 - regression_loss: 0.8146 - classification_loss: 0.0897 301/500 [=================>............] - ETA: 1:07 - loss: 0.9054 - regression_loss: 0.8154 - classification_loss: 0.0900 302/500 [=================>............] - ETA: 1:07 - loss: 0.9053 - regression_loss: 0.8153 - classification_loss: 0.0900 303/500 [=================>............] - ETA: 1:06 - loss: 0.9047 - regression_loss: 0.8148 - classification_loss: 0.0899 304/500 [=================>............] - ETA: 1:06 - loss: 0.9051 - regression_loss: 0.8151 - classification_loss: 0.0899 305/500 [=================>............] - ETA: 1:06 - loss: 0.9059 - regression_loss: 0.8157 - classification_loss: 0.0901 306/500 [=================>............] - ETA: 1:05 - loss: 0.9050 - regression_loss: 0.8150 - classification_loss: 0.0900 307/500 [=================>............] - ETA: 1:05 - loss: 0.9048 - regression_loss: 0.8148 - classification_loss: 0.0900 308/500 [=================>............] - ETA: 1:05 - loss: 0.9044 - regression_loss: 0.8145 - classification_loss: 0.0899 309/500 [=================>............] - ETA: 1:04 - loss: 0.9042 - regression_loss: 0.8142 - classification_loss: 0.0900 310/500 [=================>............] - ETA: 1:04 - loss: 0.9034 - regression_loss: 0.8136 - classification_loss: 0.0898 311/500 [=================>............] - ETA: 1:04 - loss: 0.9048 - regression_loss: 0.8148 - classification_loss: 0.0901 312/500 [=================>............] - ETA: 1:03 - loss: 0.9050 - regression_loss: 0.8148 - classification_loss: 0.0902 313/500 [=================>............] 
- ETA: 1:03 - loss: 0.9055 - regression_loss: 0.8153 - classification_loss: 0.0902 314/500 [=================>............] - ETA: 1:03 - loss: 0.9042 - regression_loss: 0.8141 - classification_loss: 0.0900 315/500 [=================>............] - ETA: 1:02 - loss: 0.9049 - regression_loss: 0.8147 - classification_loss: 0.0902 316/500 [=================>............] - ETA: 1:02 - loss: 0.9071 - regression_loss: 0.8164 - classification_loss: 0.0907 317/500 [==================>...........] - ETA: 1:02 - loss: 0.9084 - regression_loss: 0.8175 - classification_loss: 0.0910 318/500 [==================>...........] - ETA: 1:01 - loss: 0.9069 - regression_loss: 0.8161 - classification_loss: 0.0907 319/500 [==================>...........] - ETA: 1:01 - loss: 0.9066 - regression_loss: 0.8159 - classification_loss: 0.0907 320/500 [==================>...........] - ETA: 1:01 - loss: 0.9073 - regression_loss: 0.8164 - classification_loss: 0.0909 321/500 [==================>...........] - ETA: 1:00 - loss: 0.9080 - regression_loss: 0.8171 - classification_loss: 0.0908 322/500 [==================>...........] - ETA: 1:00 - loss: 0.9082 - regression_loss: 0.8172 - classification_loss: 0.0910 323/500 [==================>...........] - ETA: 1:00 - loss: 0.9059 - regression_loss: 0.8151 - classification_loss: 0.0908 324/500 [==================>...........] - ETA: 59s - loss: 0.9060 - regression_loss: 0.8152 - classification_loss: 0.0908  325/500 [==================>...........] - ETA: 59s - loss: 0.9060 - regression_loss: 0.8152 - classification_loss: 0.0908 326/500 [==================>...........] - ETA: 59s - loss: 0.9061 - regression_loss: 0.8153 - classification_loss: 0.0909 327/500 [==================>...........] - ETA: 58s - loss: 0.9075 - regression_loss: 0.8163 - classification_loss: 0.0912 328/500 [==================>...........] - ETA: 58s - loss: 0.9074 - regression_loss: 0.8163 - classification_loss: 0.0911 329/500 [==================>...........] 
- ETA: 58s - loss: 0.9053 - regression_loss: 0.8145 - classification_loss: 0.0908 330/500 [==================>...........] - ETA: 57s - loss: 0.9042 - regression_loss: 0.8136 - classification_loss: 0.0906 331/500 [==================>...........] - ETA: 57s - loss: 0.9027 - regression_loss: 0.8123 - classification_loss: 0.0904 332/500 [==================>...........] - ETA: 56s - loss: 0.9023 - regression_loss: 0.8120 - classification_loss: 0.0903 333/500 [==================>...........] - ETA: 56s - loss: 0.9025 - regression_loss: 0.8120 - classification_loss: 0.0904 334/500 [===================>..........] - ETA: 56s - loss: 0.9022 - regression_loss: 0.8118 - classification_loss: 0.0904 335/500 [===================>..........] - ETA: 55s - loss: 0.9021 - regression_loss: 0.8116 - classification_loss: 0.0905 336/500 [===================>..........] - ETA: 55s - loss: 0.9034 - regression_loss: 0.8128 - classification_loss: 0.0906 337/500 [===================>..........] - ETA: 55s - loss: 0.9041 - regression_loss: 0.8134 - classification_loss: 0.0907 338/500 [===================>..........] - ETA: 54s - loss: 0.9035 - regression_loss: 0.8128 - classification_loss: 0.0906 339/500 [===================>..........] - ETA: 54s - loss: 0.9038 - regression_loss: 0.8130 - classification_loss: 0.0908 340/500 [===================>..........] - ETA: 54s - loss: 0.9027 - regression_loss: 0.8121 - classification_loss: 0.0906 341/500 [===================>..........] - ETA: 53s - loss: 0.9023 - regression_loss: 0.8118 - classification_loss: 0.0906 342/500 [===================>..........] - ETA: 53s - loss: 0.9029 - regression_loss: 0.8122 - classification_loss: 0.0907 343/500 [===================>..........] - ETA: 53s - loss: 0.9018 - regression_loss: 0.8112 - classification_loss: 0.0906 344/500 [===================>..........] - ETA: 52s - loss: 0.9002 - regression_loss: 0.8097 - classification_loss: 0.0904 345/500 [===================>..........] 
- ETA: 52s - loss: 0.9028 - regression_loss: 0.8124 - classification_loss: 0.0904 346/500 [===================>..........] - ETA: 52s - loss: 0.9030 - regression_loss: 0.8125 - classification_loss: 0.0905 347/500 [===================>..........] - ETA: 51s - loss: 0.9029 - regression_loss: 0.8125 - classification_loss: 0.0905 348/500 [===================>..........] - ETA: 51s - loss: 0.9033 - regression_loss: 0.8128 - classification_loss: 0.0905 349/500 [===================>..........] - ETA: 51s - loss: 0.9034 - regression_loss: 0.8129 - classification_loss: 0.0905 350/500 [====================>.........] - ETA: 50s - loss: 0.9038 - regression_loss: 0.8132 - classification_loss: 0.0906 351/500 [====================>.........] - ETA: 50s - loss: 0.9053 - regression_loss: 0.8145 - classification_loss: 0.0908 352/500 [====================>.........] - ETA: 50s - loss: 0.9049 - regression_loss: 0.8141 - classification_loss: 0.0907 353/500 [====================>.........] - ETA: 49s - loss: 0.9041 - regression_loss: 0.8135 - classification_loss: 0.0906 354/500 [====================>.........] - ETA: 49s - loss: 0.9031 - regression_loss: 0.8127 - classification_loss: 0.0904 355/500 [====================>.........] - ETA: 49s - loss: 0.9014 - regression_loss: 0.8112 - classification_loss: 0.0902 356/500 [====================>.........] - ETA: 48s - loss: 0.9008 - regression_loss: 0.8107 - classification_loss: 0.0901 357/500 [====================>.........] - ETA: 48s - loss: 0.8991 - regression_loss: 0.8091 - classification_loss: 0.0899 358/500 [====================>.........] - ETA: 48s - loss: 0.9005 - regression_loss: 0.8105 - classification_loss: 0.0900 359/500 [====================>.........] - ETA: 47s - loss: 0.9010 - regression_loss: 0.8109 - classification_loss: 0.0902 360/500 [====================>.........] - ETA: 47s - loss: 0.8998 - regression_loss: 0.8097 - classification_loss: 0.0901 361/500 [====================>.........] 
- ETA: 47s - loss: 0.8997 - regression_loss: 0.8097 - classification_loss: 0.0900 362/500 [====================>.........] - ETA: 46s - loss: 0.9014 - regression_loss: 0.8111 - classification_loss: 0.0903 363/500 [====================>.........] - ETA: 46s - loss: 0.9033 - regression_loss: 0.8127 - classification_loss: 0.0906 364/500 [====================>.........] - ETA: 46s - loss: 0.9029 - regression_loss: 0.8124 - classification_loss: 0.0905 365/500 [====================>.........] - ETA: 45s - loss: 0.9048 - regression_loss: 0.8139 - classification_loss: 0.0909 366/500 [====================>.........] - ETA: 45s - loss: 0.9041 - regression_loss: 0.8133 - classification_loss: 0.0908 367/500 [=====================>........] - ETA: 45s - loss: 0.9031 - regression_loss: 0.8124 - classification_loss: 0.0906 368/500 [=====================>........] - ETA: 44s - loss: 0.9035 - regression_loss: 0.8129 - classification_loss: 0.0907 369/500 [=====================>........] - ETA: 44s - loss: 0.9029 - regression_loss: 0.8123 - classification_loss: 0.0905 370/500 [=====================>........] - ETA: 44s - loss: 0.9027 - regression_loss: 0.8122 - classification_loss: 0.0905 371/500 [=====================>........] - ETA: 43s - loss: 0.9016 - regression_loss: 0.8112 - classification_loss: 0.0904 372/500 [=====================>........] - ETA: 43s - loss: 0.9022 - regression_loss: 0.8118 - classification_loss: 0.0904 373/500 [=====================>........] - ETA: 43s - loss: 0.9027 - regression_loss: 0.8122 - classification_loss: 0.0905 374/500 [=====================>........] - ETA: 42s - loss: 0.9013 - regression_loss: 0.8110 - classification_loss: 0.0903 375/500 [=====================>........] - ETA: 42s - loss: 0.9012 - regression_loss: 0.8109 - classification_loss: 0.0903 376/500 [=====================>........] - ETA: 42s - loss: 0.8998 - regression_loss: 0.8096 - classification_loss: 0.0902 377/500 [=====================>........] 
- ETA: 41s - loss: 0.9008 - regression_loss: 0.8105 - classification_loss: 0.0903 378/500 [=====================>........] - ETA: 41s - loss: 0.9000 - regression_loss: 0.8098 - classification_loss: 0.0902 379/500 [=====================>........] - ETA: 41s - loss: 0.9003 - regression_loss: 0.8100 - classification_loss: 0.0903 380/500 [=====================>........] - ETA: 40s - loss: 0.9008 - regression_loss: 0.8104 - classification_loss: 0.0905 381/500 [=====================>........] - ETA: 40s - loss: 0.9015 - regression_loss: 0.8108 - classification_loss: 0.0906 382/500 [=====================>........] - ETA: 40s - loss: 0.9011 - regression_loss: 0.8105 - classification_loss: 0.0906 383/500 [=====================>........] - ETA: 39s - loss: 0.9002 - regression_loss: 0.8098 - classification_loss: 0.0905 384/500 [======================>.......] - ETA: 39s - loss: 0.9008 - regression_loss: 0.8104 - classification_loss: 0.0904 385/500 [======================>.......] - ETA: 38s - loss: 0.9015 - regression_loss: 0.8110 - classification_loss: 0.0905 386/500 [======================>.......] - ETA: 38s - loss: 0.9017 - regression_loss: 0.8112 - classification_loss: 0.0905 387/500 [======================>.......] - ETA: 38s - loss: 0.9010 - regression_loss: 0.8106 - classification_loss: 0.0904 388/500 [======================>.......] - ETA: 37s - loss: 0.9010 - regression_loss: 0.8107 - classification_loss: 0.0902 389/500 [======================>.......] - ETA: 37s - loss: 0.9006 - regression_loss: 0.8105 - classification_loss: 0.0901 390/500 [======================>.......] - ETA: 37s - loss: 0.8991 - regression_loss: 0.8092 - classification_loss: 0.0900 391/500 [======================>.......] - ETA: 36s - loss: 0.8987 - regression_loss: 0.8088 - classification_loss: 0.0899 392/500 [======================>.......] - ETA: 36s - loss: 0.8996 - regression_loss: 0.8095 - classification_loss: 0.0900 393/500 [======================>.......] 
- ETA: 36s - loss: 0.9001 - regression_loss: 0.8097 - classification_loss: 0.0904 394/500 [======================>.......] - ETA: 35s - loss: 0.8992 - regression_loss: 0.8090 - classification_loss: 0.0902 395/500 [======================>.......] - ETA: 35s - loss: 0.8997 - regression_loss: 0.8094 - classification_loss: 0.0903 396/500 [======================>.......] - ETA: 35s - loss: 0.8996 - regression_loss: 0.8093 - classification_loss: 0.0902 397/500 [======================>.......] - ETA: 34s - loss: 0.8997 - regression_loss: 0.8095 - classification_loss: 0.0902 398/500 [======================>.......] - ETA: 34s - loss: 0.9001 - regression_loss: 0.8098 - classification_loss: 0.0903 399/500 [======================>.......] - ETA: 34s - loss: 0.9005 - regression_loss: 0.8102 - classification_loss: 0.0902 400/500 [=======================>......] - ETA: 33s - loss: 0.9002 - regression_loss: 0.8100 - classification_loss: 0.0902 401/500 [=======================>......] - ETA: 33s - loss: 0.9003 - regression_loss: 0.8101 - classification_loss: 0.0902 402/500 [=======================>......] - ETA: 33s - loss: 0.9017 - regression_loss: 0.8113 - classification_loss: 0.0904 403/500 [=======================>......] - ETA: 32s - loss: 0.9011 - regression_loss: 0.8108 - classification_loss: 0.0903 404/500 [=======================>......] - ETA: 32s - loss: 0.9016 - regression_loss: 0.8112 - classification_loss: 0.0904 405/500 [=======================>......] - ETA: 32s - loss: 0.9024 - regression_loss: 0.8119 - classification_loss: 0.0905 406/500 [=======================>......] - ETA: 31s - loss: 0.9020 - regression_loss: 0.8117 - classification_loss: 0.0904 407/500 [=======================>......] - ETA: 31s - loss: 0.9010 - regression_loss: 0.8106 - classification_loss: 0.0903 408/500 [=======================>......] - ETA: 31s - loss: 0.9010 - regression_loss: 0.8107 - classification_loss: 0.0904 409/500 [=======================>......] 
- ETA: 30s - loss: 0.9009 - regression_loss: 0.8106 - classification_loss: 0.0903 410/500 [=======================>......] - ETA: 30s - loss: 0.9013 - regression_loss: 0.8110 - classification_loss: 0.0903 411/500 [=======================>......] - ETA: 30s - loss: 0.9033 - regression_loss: 0.8130 - classification_loss: 0.0903 412/500 [=======================>......] - ETA: 29s - loss: 0.9037 - regression_loss: 0.8133 - classification_loss: 0.0903 413/500 [=======================>......] - ETA: 29s - loss: 0.9037 - regression_loss: 0.8133 - classification_loss: 0.0903 414/500 [=======================>......] - ETA: 29s - loss: 0.9037 - regression_loss: 0.8133 - classification_loss: 0.0903 415/500 [=======================>......] - ETA: 28s - loss: 0.9046 - regression_loss: 0.8142 - classification_loss: 0.0904 416/500 [=======================>......] - ETA: 28s - loss: 0.9053 - regression_loss: 0.8148 - classification_loss: 0.0905 417/500 [========================>.....] - ETA: 28s - loss: 0.9044 - regression_loss: 0.8140 - classification_loss: 0.0903 418/500 [========================>.....] - ETA: 27s - loss: 0.9035 - regression_loss: 0.8132 - classification_loss: 0.0902 419/500 [========================>.....] - ETA: 27s - loss: 0.9031 - regression_loss: 0.8128 - classification_loss: 0.0903 420/500 [========================>.....] - ETA: 27s - loss: 0.9028 - regression_loss: 0.8125 - classification_loss: 0.0903 421/500 [========================>.....] - ETA: 26s - loss: 0.9028 - regression_loss: 0.8124 - classification_loss: 0.0904 422/500 [========================>.....] - ETA: 26s - loss: 0.9014 - regression_loss: 0.8111 - classification_loss: 0.0903 423/500 [========================>.....] - ETA: 26s - loss: 0.9018 - regression_loss: 0.8114 - classification_loss: 0.0904 424/500 [========================>.....] - ETA: 25s - loss: 0.9020 - regression_loss: 0.8116 - classification_loss: 0.0904 425/500 [========================>.....] 
- ETA: 25s - loss: 0.9011 - regression_loss: 0.8108 - classification_loss: 0.0903
[per-step progress-bar redraws for steps 426-499 of epoch 38 omitted]
500/500 [==============================] - 170s 339ms/step - loss: 0.8955 - regression_loss: 0.8065 - classification_loss: 0.0890
1172 instances of class plum with average precision: 0.8028
mAP: 0.8028
Epoch 00038: saving model to ./training/snapshots/resnet101_pascal_38.h5
Epoch 39/150
[per-step progress-bar redraws for steps 1-259 of epoch 39 omitted]
260/500 [==============>...............]
- ETA: 1:21 - loss: 0.8795 - regression_loss: 0.7909 - classification_loss: 0.0886 261/500 [==============>...............] - ETA: 1:21 - loss: 0.8804 - regression_loss: 0.7918 - classification_loss: 0.0885 262/500 [==============>...............] - ETA: 1:20 - loss: 0.8796 - regression_loss: 0.7912 - classification_loss: 0.0884 263/500 [==============>...............] - ETA: 1:20 - loss: 0.8799 - regression_loss: 0.7914 - classification_loss: 0.0885 264/500 [==============>...............] - ETA: 1:20 - loss: 0.8813 - regression_loss: 0.7926 - classification_loss: 0.0888 265/500 [==============>...............] - ETA: 1:19 - loss: 0.8819 - regression_loss: 0.7930 - classification_loss: 0.0888 266/500 [==============>...............] - ETA: 1:19 - loss: 0.8817 - regression_loss: 0.7931 - classification_loss: 0.0886 267/500 [===============>..............] - ETA: 1:19 - loss: 0.8817 - regression_loss: 0.7930 - classification_loss: 0.0886 268/500 [===============>..............] - ETA: 1:18 - loss: 0.8827 - regression_loss: 0.7939 - classification_loss: 0.0888 269/500 [===============>..............] - ETA: 1:18 - loss: 0.8845 - regression_loss: 0.7951 - classification_loss: 0.0894 270/500 [===============>..............] - ETA: 1:18 - loss: 0.8831 - regression_loss: 0.7940 - classification_loss: 0.0891 271/500 [===============>..............] - ETA: 1:17 - loss: 0.8813 - regression_loss: 0.7924 - classification_loss: 0.0888 272/500 [===============>..............] - ETA: 1:17 - loss: 0.8816 - regression_loss: 0.7928 - classification_loss: 0.0888 273/500 [===============>..............] - ETA: 1:17 - loss: 0.8815 - regression_loss: 0.7927 - classification_loss: 0.0888 274/500 [===============>..............] - ETA: 1:16 - loss: 0.8813 - regression_loss: 0.7928 - classification_loss: 0.0885 275/500 [===============>..............] - ETA: 1:16 - loss: 0.8808 - regression_loss: 0.7925 - classification_loss: 0.0883 276/500 [===============>..............] 
- ETA: 1:16 - loss: 0.8820 - regression_loss: 0.7937 - classification_loss: 0.0883 277/500 [===============>..............] - ETA: 1:15 - loss: 0.8839 - regression_loss: 0.7952 - classification_loss: 0.0886 278/500 [===============>..............] - ETA: 1:15 - loss: 0.8840 - regression_loss: 0.7952 - classification_loss: 0.0888 279/500 [===============>..............] - ETA: 1:15 - loss: 0.8859 - regression_loss: 0.7968 - classification_loss: 0.0891 280/500 [===============>..............] - ETA: 1:14 - loss: 0.8849 - regression_loss: 0.7961 - classification_loss: 0.0888 281/500 [===============>..............] - ETA: 1:14 - loss: 0.8867 - regression_loss: 0.7976 - classification_loss: 0.0892 282/500 [===============>..............] - ETA: 1:14 - loss: 0.8877 - regression_loss: 0.7984 - classification_loss: 0.0893 283/500 [===============>..............] - ETA: 1:13 - loss: 0.8862 - regression_loss: 0.7971 - classification_loss: 0.0891 284/500 [================>.............] - ETA: 1:13 - loss: 0.8869 - regression_loss: 0.7978 - classification_loss: 0.0891 285/500 [================>.............] - ETA: 1:12 - loss: 0.8863 - regression_loss: 0.7973 - classification_loss: 0.0890 286/500 [================>.............] - ETA: 1:12 - loss: 0.8867 - regression_loss: 0.7976 - classification_loss: 0.0891 287/500 [================>.............] - ETA: 1:12 - loss: 0.8867 - regression_loss: 0.7975 - classification_loss: 0.0892 288/500 [================>.............] - ETA: 1:11 - loss: 0.8871 - regression_loss: 0.7978 - classification_loss: 0.0893 289/500 [================>.............] - ETA: 1:11 - loss: 0.8898 - regression_loss: 0.8001 - classification_loss: 0.0898 290/500 [================>.............] - ETA: 1:11 - loss: 0.8883 - regression_loss: 0.7987 - classification_loss: 0.0896 291/500 [================>.............] - ETA: 1:10 - loss: 0.8900 - regression_loss: 0.8001 - classification_loss: 0.0899 292/500 [================>.............] 
- ETA: 1:10 - loss: 0.8904 - regression_loss: 0.8005 - classification_loss: 0.0899 293/500 [================>.............] - ETA: 1:10 - loss: 0.8891 - regression_loss: 0.7993 - classification_loss: 0.0897 294/500 [================>.............] - ETA: 1:09 - loss: 0.8900 - regression_loss: 0.8000 - classification_loss: 0.0900 295/500 [================>.............] - ETA: 1:09 - loss: 0.8910 - regression_loss: 0.8007 - classification_loss: 0.0902 296/500 [================>.............] - ETA: 1:09 - loss: 0.8903 - regression_loss: 0.8002 - classification_loss: 0.0901 297/500 [================>.............] - ETA: 1:08 - loss: 0.8908 - regression_loss: 0.8006 - classification_loss: 0.0903 298/500 [================>.............] - ETA: 1:08 - loss: 0.8911 - regression_loss: 0.8009 - classification_loss: 0.0902 299/500 [================>.............] - ETA: 1:08 - loss: 0.8901 - regression_loss: 0.8000 - classification_loss: 0.0901 300/500 [=================>............] - ETA: 1:07 - loss: 0.8907 - regression_loss: 0.8004 - classification_loss: 0.0902 301/500 [=================>............] - ETA: 1:07 - loss: 0.8908 - regression_loss: 0.8005 - classification_loss: 0.0903 302/500 [=================>............] - ETA: 1:07 - loss: 0.8915 - regression_loss: 0.8011 - classification_loss: 0.0904 303/500 [=================>............] - ETA: 1:06 - loss: 0.8899 - regression_loss: 0.7997 - classification_loss: 0.0902 304/500 [=================>............] - ETA: 1:06 - loss: 0.8916 - regression_loss: 0.8013 - classification_loss: 0.0904 305/500 [=================>............] - ETA: 1:06 - loss: 0.8913 - regression_loss: 0.8010 - classification_loss: 0.0903 306/500 [=================>............] - ETA: 1:05 - loss: 0.8899 - regression_loss: 0.7997 - classification_loss: 0.0901 307/500 [=================>............] - ETA: 1:05 - loss: 0.8918 - regression_loss: 0.8014 - classification_loss: 0.0904 308/500 [=================>............] 
- ETA: 1:05 - loss: 0.8902 - regression_loss: 0.8001 - classification_loss: 0.0902 309/500 [=================>............] - ETA: 1:04 - loss: 0.8896 - regression_loss: 0.7995 - classification_loss: 0.0901 310/500 [=================>............] - ETA: 1:04 - loss: 0.8905 - regression_loss: 0.8003 - classification_loss: 0.0901 311/500 [=================>............] - ETA: 1:04 - loss: 0.8899 - regression_loss: 0.7999 - classification_loss: 0.0899 312/500 [=================>............] - ETA: 1:03 - loss: 0.8913 - regression_loss: 0.8012 - classification_loss: 0.0901 313/500 [=================>............] - ETA: 1:03 - loss: 0.8919 - regression_loss: 0.8016 - classification_loss: 0.0903 314/500 [=================>............] - ETA: 1:03 - loss: 0.8909 - regression_loss: 0.8008 - classification_loss: 0.0901 315/500 [=================>............] - ETA: 1:02 - loss: 0.8906 - regression_loss: 0.8006 - classification_loss: 0.0899 316/500 [=================>............] - ETA: 1:02 - loss: 0.8908 - regression_loss: 0.8009 - classification_loss: 0.0899 317/500 [==================>...........] - ETA: 1:02 - loss: 0.8902 - regression_loss: 0.8005 - classification_loss: 0.0898 318/500 [==================>...........] - ETA: 1:01 - loss: 0.8913 - regression_loss: 0.8014 - classification_loss: 0.0900 319/500 [==================>...........] - ETA: 1:01 - loss: 0.8927 - regression_loss: 0.8022 - classification_loss: 0.0905 320/500 [==================>...........] - ETA: 1:01 - loss: 0.8910 - regression_loss: 0.8007 - classification_loss: 0.0903 321/500 [==================>...........] - ETA: 1:00 - loss: 0.8920 - regression_loss: 0.8014 - classification_loss: 0.0906 322/500 [==================>...........] - ETA: 1:00 - loss: 0.8927 - regression_loss: 0.8019 - classification_loss: 0.0908 323/500 [==================>...........] - ETA: 1:00 - loss: 0.8927 - regression_loss: 0.8020 - classification_loss: 0.0907 324/500 [==================>...........] 
- ETA: 59s - loss: 0.8904 - regression_loss: 0.7999 - classification_loss: 0.0905  325/500 [==================>...........] - ETA: 59s - loss: 0.8899 - regression_loss: 0.7996 - classification_loss: 0.0903 326/500 [==================>...........] - ETA: 59s - loss: 0.8899 - regression_loss: 0.7996 - classification_loss: 0.0903 327/500 [==================>...........] - ETA: 58s - loss: 0.8915 - regression_loss: 0.8010 - classification_loss: 0.0905 328/500 [==================>...........] - ETA: 58s - loss: 0.8909 - regression_loss: 0.8005 - classification_loss: 0.0905 329/500 [==================>...........] - ETA: 58s - loss: 0.8911 - regression_loss: 0.8006 - classification_loss: 0.0905 330/500 [==================>...........] - ETA: 57s - loss: 0.8910 - regression_loss: 0.8006 - classification_loss: 0.0904 331/500 [==================>...........] - ETA: 57s - loss: 0.8890 - regression_loss: 0.7988 - classification_loss: 0.0902 332/500 [==================>...........] - ETA: 57s - loss: 0.8877 - regression_loss: 0.7976 - classification_loss: 0.0901 333/500 [==================>...........] - ETA: 56s - loss: 0.8878 - regression_loss: 0.7979 - classification_loss: 0.0900 334/500 [===================>..........] - ETA: 56s - loss: 0.8876 - regression_loss: 0.7977 - classification_loss: 0.0900 335/500 [===================>..........] - ETA: 55s - loss: 0.8888 - regression_loss: 0.7984 - classification_loss: 0.0904 336/500 [===================>..........] - ETA: 55s - loss: 0.8889 - regression_loss: 0.7985 - classification_loss: 0.0904 337/500 [===================>..........] - ETA: 55s - loss: 0.8896 - regression_loss: 0.7992 - classification_loss: 0.0904 338/500 [===================>..........] - ETA: 54s - loss: 0.8899 - regression_loss: 0.7995 - classification_loss: 0.0904 339/500 [===================>..........] - ETA: 54s - loss: 0.8883 - regression_loss: 0.7979 - classification_loss: 0.0903 340/500 [===================>..........] 
- ETA: 54s - loss: 0.8889 - regression_loss: 0.7985 - classification_loss: 0.0904 341/500 [===================>..........] - ETA: 53s - loss: 0.8877 - regression_loss: 0.7975 - classification_loss: 0.0902 342/500 [===================>..........] - ETA: 53s - loss: 0.8883 - regression_loss: 0.7981 - classification_loss: 0.0902 343/500 [===================>..........] - ETA: 53s - loss: 0.8873 - regression_loss: 0.7971 - classification_loss: 0.0901 344/500 [===================>..........] - ETA: 52s - loss: 0.8877 - regression_loss: 0.7975 - classification_loss: 0.0902 345/500 [===================>..........] - ETA: 52s - loss: 0.8885 - regression_loss: 0.7981 - classification_loss: 0.0903 346/500 [===================>..........] - ETA: 52s - loss: 0.8892 - regression_loss: 0.7988 - classification_loss: 0.0904 347/500 [===================>..........] - ETA: 51s - loss: 0.8896 - regression_loss: 0.7993 - classification_loss: 0.0903 348/500 [===================>..........] - ETA: 51s - loss: 0.8886 - regression_loss: 0.7985 - classification_loss: 0.0902 349/500 [===================>..........] - ETA: 51s - loss: 0.8885 - regression_loss: 0.7982 - classification_loss: 0.0902 350/500 [====================>.........] - ETA: 50s - loss: 0.8883 - regression_loss: 0.7982 - classification_loss: 0.0901 351/500 [====================>.........] - ETA: 50s - loss: 0.8880 - regression_loss: 0.7980 - classification_loss: 0.0900 352/500 [====================>.........] - ETA: 50s - loss: 0.8865 - regression_loss: 0.7967 - classification_loss: 0.0898 353/500 [====================>.........] - ETA: 49s - loss: 0.8880 - regression_loss: 0.7980 - classification_loss: 0.0900 354/500 [====================>.........] - ETA: 49s - loss: 0.8874 - regression_loss: 0.7975 - classification_loss: 0.0899 355/500 [====================>.........] - ETA: 49s - loss: 0.8882 - regression_loss: 0.7984 - classification_loss: 0.0898 356/500 [====================>.........] 
- ETA: 48s - loss: 0.8890 - regression_loss: 0.7990 - classification_loss: 0.0900 357/500 [====================>.........] - ETA: 48s - loss: 0.8888 - regression_loss: 0.7989 - classification_loss: 0.0899 358/500 [====================>.........] - ETA: 48s - loss: 0.8886 - regression_loss: 0.7987 - classification_loss: 0.0899 359/500 [====================>.........] - ETA: 47s - loss: 0.8898 - regression_loss: 0.7999 - classification_loss: 0.0899 360/500 [====================>.........] - ETA: 47s - loss: 0.8901 - regression_loss: 0.8002 - classification_loss: 0.0899 361/500 [====================>.........] - ETA: 47s - loss: 0.8892 - regression_loss: 0.7995 - classification_loss: 0.0897 362/500 [====================>.........] - ETA: 46s - loss: 0.8889 - regression_loss: 0.7992 - classification_loss: 0.0896 363/500 [====================>.........] - ETA: 46s - loss: 0.8901 - regression_loss: 0.8003 - classification_loss: 0.0898 364/500 [====================>.........] - ETA: 46s - loss: 0.8901 - regression_loss: 0.8004 - classification_loss: 0.0897 365/500 [====================>.........] - ETA: 45s - loss: 0.8895 - regression_loss: 0.7999 - classification_loss: 0.0897 366/500 [====================>.........] - ETA: 45s - loss: 0.8894 - regression_loss: 0.7998 - classification_loss: 0.0896 367/500 [=====================>........] - ETA: 45s - loss: 0.8881 - regression_loss: 0.7987 - classification_loss: 0.0894 368/500 [=====================>........] - ETA: 44s - loss: 0.8883 - regression_loss: 0.7990 - classification_loss: 0.0894 369/500 [=====================>........] - ETA: 44s - loss: 0.8871 - regression_loss: 0.7979 - classification_loss: 0.0892 370/500 [=====================>........] - ETA: 44s - loss: 0.8892 - regression_loss: 0.7997 - classification_loss: 0.0895 371/500 [=====================>........] - ETA: 43s - loss: 0.8909 - regression_loss: 0.8013 - classification_loss: 0.0896 372/500 [=====================>........] 
- ETA: 43s - loss: 0.8912 - regression_loss: 0.8016 - classification_loss: 0.0896 373/500 [=====================>........] - ETA: 43s - loss: 0.8912 - regression_loss: 0.8016 - classification_loss: 0.0896 374/500 [=====================>........] - ETA: 42s - loss: 0.8902 - regression_loss: 0.8007 - classification_loss: 0.0894 375/500 [=====================>........] - ETA: 42s - loss: 0.8899 - regression_loss: 0.8006 - classification_loss: 0.0893 376/500 [=====================>........] - ETA: 42s - loss: 0.8899 - regression_loss: 0.8005 - classification_loss: 0.0893 377/500 [=====================>........] - ETA: 41s - loss: 0.8905 - regression_loss: 0.8011 - classification_loss: 0.0894 378/500 [=====================>........] - ETA: 41s - loss: 0.8891 - regression_loss: 0.7999 - classification_loss: 0.0892 379/500 [=====================>........] - ETA: 41s - loss: 0.8900 - regression_loss: 0.8006 - classification_loss: 0.0893 380/500 [=====================>........] - ETA: 40s - loss: 0.8900 - regression_loss: 0.8006 - classification_loss: 0.0894 381/500 [=====================>........] - ETA: 40s - loss: 0.8885 - regression_loss: 0.7993 - classification_loss: 0.0892 382/500 [=====================>........] - ETA: 40s - loss: 0.8891 - regression_loss: 0.7998 - classification_loss: 0.0893 383/500 [=====================>........] - ETA: 39s - loss: 0.8894 - regression_loss: 0.8001 - classification_loss: 0.0894 384/500 [======================>.......] - ETA: 39s - loss: 0.8895 - regression_loss: 0.8001 - classification_loss: 0.0893 385/500 [======================>.......] - ETA: 38s - loss: 0.8901 - regression_loss: 0.8006 - classification_loss: 0.0894 386/500 [======================>.......] - ETA: 38s - loss: 0.8897 - regression_loss: 0.8002 - classification_loss: 0.0895 387/500 [======================>.......] - ETA: 38s - loss: 0.8886 - regression_loss: 0.7993 - classification_loss: 0.0893 388/500 [======================>.......] 
- ETA: 37s - loss: 0.8877 - regression_loss: 0.7986 - classification_loss: 0.0891 389/500 [======================>.......] - ETA: 37s - loss: 0.8889 - regression_loss: 0.7996 - classification_loss: 0.0893 390/500 [======================>.......] - ETA: 37s - loss: 0.8892 - regression_loss: 0.7997 - classification_loss: 0.0895 391/500 [======================>.......] - ETA: 36s - loss: 0.8892 - regression_loss: 0.7998 - classification_loss: 0.0894 392/500 [======================>.......] - ETA: 36s - loss: 0.8893 - regression_loss: 0.7999 - classification_loss: 0.0894 393/500 [======================>.......] - ETA: 36s - loss: 0.8890 - regression_loss: 0.7997 - classification_loss: 0.0893 394/500 [======================>.......] - ETA: 35s - loss: 0.8885 - regression_loss: 0.7992 - classification_loss: 0.0893 395/500 [======================>.......] - ETA: 35s - loss: 0.8884 - regression_loss: 0.7991 - classification_loss: 0.0893 396/500 [======================>.......] - ETA: 35s - loss: 0.8885 - regression_loss: 0.7991 - classification_loss: 0.0893 397/500 [======================>.......] - ETA: 34s - loss: 0.8890 - regression_loss: 0.7996 - classification_loss: 0.0893 398/500 [======================>.......] - ETA: 34s - loss: 0.8894 - regression_loss: 0.8000 - classification_loss: 0.0895 399/500 [======================>.......] - ETA: 34s - loss: 0.8890 - regression_loss: 0.7997 - classification_loss: 0.0893 400/500 [=======================>......] - ETA: 33s - loss: 0.8897 - regression_loss: 0.8003 - classification_loss: 0.0895 401/500 [=======================>......] - ETA: 33s - loss: 0.8898 - regression_loss: 0.8004 - classification_loss: 0.0893 402/500 [=======================>......] - ETA: 33s - loss: 0.8900 - regression_loss: 0.8007 - classification_loss: 0.0893 403/500 [=======================>......] - ETA: 32s - loss: 0.8914 - regression_loss: 0.8020 - classification_loss: 0.0894 404/500 [=======================>......] 
- ETA: 32s - loss: 0.8919 - regression_loss: 0.8024 - classification_loss: 0.0895 405/500 [=======================>......] - ETA: 32s - loss: 0.8927 - regression_loss: 0.8032 - classification_loss: 0.0895 406/500 [=======================>......] - ETA: 31s - loss: 0.8929 - regression_loss: 0.8033 - classification_loss: 0.0896 407/500 [=======================>......] - ETA: 31s - loss: 0.8925 - regression_loss: 0.8030 - classification_loss: 0.0895 408/500 [=======================>......] - ETA: 31s - loss: 0.8921 - regression_loss: 0.8027 - classification_loss: 0.0894 409/500 [=======================>......] - ETA: 30s - loss: 0.8922 - regression_loss: 0.8028 - classification_loss: 0.0894 410/500 [=======================>......] - ETA: 30s - loss: 0.8922 - regression_loss: 0.8029 - classification_loss: 0.0893 411/500 [=======================>......] - ETA: 30s - loss: 0.8930 - regression_loss: 0.8036 - classification_loss: 0.0894 412/500 [=======================>......] - ETA: 29s - loss: 0.8918 - regression_loss: 0.8025 - classification_loss: 0.0892 413/500 [=======================>......] - ETA: 29s - loss: 0.8915 - regression_loss: 0.8024 - classification_loss: 0.0891 414/500 [=======================>......] - ETA: 29s - loss: 0.8903 - regression_loss: 0.8013 - classification_loss: 0.0890 415/500 [=======================>......] - ETA: 28s - loss: 0.8909 - regression_loss: 0.8019 - classification_loss: 0.0890 416/500 [=======================>......] - ETA: 28s - loss: 0.8901 - regression_loss: 0.8013 - classification_loss: 0.0889 417/500 [========================>.....] - ETA: 28s - loss: 0.8899 - regression_loss: 0.8010 - classification_loss: 0.0889 418/500 [========================>.....] - ETA: 27s - loss: 0.8903 - regression_loss: 0.8014 - classification_loss: 0.0890 419/500 [========================>.....] - ETA: 27s - loss: 0.8893 - regression_loss: 0.8005 - classification_loss: 0.0888 420/500 [========================>.....] 
- ETA: 27s - loss: 0.8895 - regression_loss: 0.8007 - classification_loss: 0.0888 421/500 [========================>.....] - ETA: 26s - loss: 0.8882 - regression_loss: 0.7996 - classification_loss: 0.0886 422/500 [========================>.....] - ETA: 26s - loss: 0.8895 - regression_loss: 0.8006 - classification_loss: 0.0889 423/500 [========================>.....] - ETA: 26s - loss: 0.8904 - regression_loss: 0.8013 - classification_loss: 0.0891 424/500 [========================>.....] - ETA: 25s - loss: 0.8901 - regression_loss: 0.8011 - classification_loss: 0.0890 425/500 [========================>.....] - ETA: 25s - loss: 0.8891 - regression_loss: 0.8003 - classification_loss: 0.0888 426/500 [========================>.....] - ETA: 25s - loss: 0.8878 - regression_loss: 0.7992 - classification_loss: 0.0887 427/500 [========================>.....] - ETA: 24s - loss: 0.8878 - regression_loss: 0.7991 - classification_loss: 0.0887 428/500 [========================>.....] - ETA: 24s - loss: 0.8878 - regression_loss: 0.7992 - classification_loss: 0.0887 429/500 [========================>.....] - ETA: 24s - loss: 0.8890 - regression_loss: 0.8001 - classification_loss: 0.0889 430/500 [========================>.....] - ETA: 23s - loss: 0.8897 - regression_loss: 0.8007 - classification_loss: 0.0890 431/500 [========================>.....] - ETA: 23s - loss: 0.8906 - regression_loss: 0.8012 - classification_loss: 0.0894 432/500 [========================>.....] - ETA: 23s - loss: 0.8904 - regression_loss: 0.8010 - classification_loss: 0.0894 433/500 [========================>.....] - ETA: 22s - loss: 0.8894 - regression_loss: 0.8001 - classification_loss: 0.0893 434/500 [=========================>....] - ETA: 22s - loss: 0.8885 - regression_loss: 0.7994 - classification_loss: 0.0892 435/500 [=========================>....] - ETA: 22s - loss: 0.8883 - regression_loss: 0.7992 - classification_loss: 0.0891 436/500 [=========================>....] 
- ETA: 21s - loss: 0.8876 - regression_loss: 0.7986 - classification_loss: 0.0890 437/500 [=========================>....] - ETA: 21s - loss: 0.8867 - regression_loss: 0.7979 - classification_loss: 0.0888 438/500 [=========================>....] - ETA: 21s - loss: 0.8858 - regression_loss: 0.7971 - classification_loss: 0.0888 439/500 [=========================>....] - ETA: 20s - loss: 0.8863 - regression_loss: 0.7974 - classification_loss: 0.0889 440/500 [=========================>....] - ETA: 20s - loss: 0.8858 - regression_loss: 0.7970 - classification_loss: 0.0888 441/500 [=========================>....] - ETA: 19s - loss: 0.8854 - regression_loss: 0.7968 - classification_loss: 0.0886 442/500 [=========================>....] - ETA: 19s - loss: 0.8851 - regression_loss: 0.7965 - classification_loss: 0.0886 443/500 [=========================>....] - ETA: 19s - loss: 0.8846 - regression_loss: 0.7961 - classification_loss: 0.0885 444/500 [=========================>....] - ETA: 18s - loss: 0.8849 - regression_loss: 0.7963 - classification_loss: 0.0886 445/500 [=========================>....] - ETA: 18s - loss: 0.8850 - regression_loss: 0.7964 - classification_loss: 0.0886 446/500 [=========================>....] - ETA: 18s - loss: 0.8854 - regression_loss: 0.7968 - classification_loss: 0.0886 447/500 [=========================>....] - ETA: 17s - loss: 0.8842 - regression_loss: 0.7957 - classification_loss: 0.0884 448/500 [=========================>....] - ETA: 17s - loss: 0.8841 - regression_loss: 0.7957 - classification_loss: 0.0883 449/500 [=========================>....] - ETA: 17s - loss: 0.8845 - regression_loss: 0.7962 - classification_loss: 0.0883 450/500 [==========================>...] - ETA: 16s - loss: 0.8851 - regression_loss: 0.7967 - classification_loss: 0.0884 451/500 [==========================>...] - ETA: 16s - loss: 0.8842 - regression_loss: 0.7959 - classification_loss: 0.0883 452/500 [==========================>...] 
- ETA: 16s - loss: 0.8833 - regression_loss: 0.7952 - classification_loss: 0.0881 453/500 [==========================>...] - ETA: 15s - loss: 0.8829 - regression_loss: 0.7949 - classification_loss: 0.0881 454/500 [==========================>...] - ETA: 15s - loss: 0.8838 - regression_loss: 0.7955 - classification_loss: 0.0883 455/500 [==========================>...] - ETA: 15s - loss: 0.8834 - regression_loss: 0.7952 - classification_loss: 0.0882 456/500 [==========================>...] - ETA: 14s - loss: 0.8834 - regression_loss: 0.7952 - classification_loss: 0.0881 457/500 [==========================>...] - ETA: 14s - loss: 0.8837 - regression_loss: 0.7954 - classification_loss: 0.0883 458/500 [==========================>...] - ETA: 14s - loss: 0.8832 - regression_loss: 0.7950 - classification_loss: 0.0883 459/500 [==========================>...] - ETA: 13s - loss: 0.8843 - regression_loss: 0.7959 - classification_loss: 0.0885 460/500 [==========================>...] - ETA: 13s - loss: 0.8835 - regression_loss: 0.7952 - classification_loss: 0.0883 461/500 [==========================>...] - ETA: 13s - loss: 0.8832 - regression_loss: 0.7948 - classification_loss: 0.0883 462/500 [==========================>...] - ETA: 12s - loss: 0.8836 - regression_loss: 0.7953 - classification_loss: 0.0884 463/500 [==========================>...] - ETA: 12s - loss: 0.8846 - regression_loss: 0.7960 - classification_loss: 0.0885 464/500 [==========================>...] - ETA: 12s - loss: 0.8852 - regression_loss: 0.7967 - classification_loss: 0.0885 465/500 [==========================>...] - ETA: 11s - loss: 0.8861 - regression_loss: 0.7975 - classification_loss: 0.0887 466/500 [==========================>...] - ETA: 11s - loss: 0.8858 - regression_loss: 0.7971 - classification_loss: 0.0886 467/500 [===========================>..] - ETA: 11s - loss: 0.8847 - regression_loss: 0.7962 - classification_loss: 0.0885 468/500 [===========================>..] 
[per-batch progress-bar output trimmed]
500/500 [==============================] - 170s 339ms/step - loss: 0.8884 - regression_loss: 0.7993 - classification_loss: 0.0891
1172 instances of class plum with average precision: 0.7980
mAP: 0.7980
Epoch 00039: saving model to ./training/snapshots/resnet101_pascal_39.h5
Epoch 40/150
[per-batch progress-bar output trimmed]
- ETA: 1:07 - loss: 0.8612 - regression_loss: 0.7759 - classification_loss: 0.0852 303/500 [=================>............] - ETA: 1:07 - loss: 0.8595 - regression_loss: 0.7745 - classification_loss: 0.0850 304/500 [=================>............] - ETA: 1:06 - loss: 0.8584 - regression_loss: 0.7735 - classification_loss: 0.0848 305/500 [=================>............] - ETA: 1:06 - loss: 0.8560 - regression_loss: 0.7714 - classification_loss: 0.0846 306/500 [=================>............] - ETA: 1:05 - loss: 0.8556 - regression_loss: 0.7711 - classification_loss: 0.0845 307/500 [=================>............] - ETA: 1:05 - loss: 0.8556 - regression_loss: 0.7710 - classification_loss: 0.0846 308/500 [=================>............] - ETA: 1:05 - loss: 0.8559 - regression_loss: 0.7712 - classification_loss: 0.0847 309/500 [=================>............] - ETA: 1:04 - loss: 0.8564 - regression_loss: 0.7718 - classification_loss: 0.0846 310/500 [=================>............] - ETA: 1:04 - loss: 0.8562 - regression_loss: 0.7716 - classification_loss: 0.0846 311/500 [=================>............] - ETA: 1:04 - loss: 0.8570 - regression_loss: 0.7723 - classification_loss: 0.0846 312/500 [=================>............] - ETA: 1:03 - loss: 0.8567 - regression_loss: 0.7722 - classification_loss: 0.0845 313/500 [=================>............] - ETA: 1:03 - loss: 0.8576 - regression_loss: 0.7729 - classification_loss: 0.0846 314/500 [=================>............] - ETA: 1:03 - loss: 0.8582 - regression_loss: 0.7735 - classification_loss: 0.0848 315/500 [=================>............] - ETA: 1:02 - loss: 0.8572 - regression_loss: 0.7726 - classification_loss: 0.0846 316/500 [=================>............] - ETA: 1:02 - loss: 0.8576 - regression_loss: 0.7730 - classification_loss: 0.0846 317/500 [==================>...........] - ETA: 1:02 - loss: 0.8575 - regression_loss: 0.7730 - classification_loss: 0.0845 318/500 [==================>...........] 
- ETA: 1:01 - loss: 0.8583 - regression_loss: 0.7736 - classification_loss: 0.0848 319/500 [==================>...........] - ETA: 1:01 - loss: 0.8578 - regression_loss: 0.7732 - classification_loss: 0.0847 320/500 [==================>...........] - ETA: 1:01 - loss: 0.8591 - regression_loss: 0.7744 - classification_loss: 0.0847 321/500 [==================>...........] - ETA: 1:00 - loss: 0.8604 - regression_loss: 0.7757 - classification_loss: 0.0848 322/500 [==================>...........] - ETA: 1:00 - loss: 0.8615 - regression_loss: 0.7765 - classification_loss: 0.0850 323/500 [==================>...........] - ETA: 1:00 - loss: 0.8612 - regression_loss: 0.7763 - classification_loss: 0.0850 324/500 [==================>...........] - ETA: 59s - loss: 0.8608 - regression_loss: 0.7759 - classification_loss: 0.0849  325/500 [==================>...........] - ETA: 59s - loss: 0.8605 - regression_loss: 0.7757 - classification_loss: 0.0848 326/500 [==================>...........] - ETA: 59s - loss: 0.8596 - regression_loss: 0.7749 - classification_loss: 0.0846 327/500 [==================>...........] - ETA: 58s - loss: 0.8592 - regression_loss: 0.7746 - classification_loss: 0.0846 328/500 [==================>...........] - ETA: 58s - loss: 0.8597 - regression_loss: 0.7751 - classification_loss: 0.0847 329/500 [==================>...........] - ETA: 58s - loss: 0.8591 - regression_loss: 0.7746 - classification_loss: 0.0845 330/500 [==================>...........] - ETA: 57s - loss: 0.8597 - regression_loss: 0.7752 - classification_loss: 0.0846 331/500 [==================>...........] - ETA: 57s - loss: 0.8586 - regression_loss: 0.7741 - classification_loss: 0.0845 332/500 [==================>...........] - ETA: 57s - loss: 0.8607 - regression_loss: 0.7758 - classification_loss: 0.0849 333/500 [==================>...........] - ETA: 56s - loss: 0.8609 - regression_loss: 0.7759 - classification_loss: 0.0850 334/500 [===================>..........] 
- ETA: 56s - loss: 0.8619 - regression_loss: 0.7767 - classification_loss: 0.0852 335/500 [===================>..........] - ETA: 56s - loss: 0.8625 - regression_loss: 0.7771 - classification_loss: 0.0853 336/500 [===================>..........] - ETA: 55s - loss: 0.8625 - regression_loss: 0.7772 - classification_loss: 0.0853 337/500 [===================>..........] - ETA: 55s - loss: 0.8617 - regression_loss: 0.7764 - classification_loss: 0.0852 338/500 [===================>..........] - ETA: 55s - loss: 0.8607 - regression_loss: 0.7757 - classification_loss: 0.0850 339/500 [===================>..........] - ETA: 54s - loss: 0.8606 - regression_loss: 0.7756 - classification_loss: 0.0849 340/500 [===================>..........] - ETA: 54s - loss: 0.8607 - regression_loss: 0.7758 - classification_loss: 0.0850 341/500 [===================>..........] - ETA: 54s - loss: 0.8609 - regression_loss: 0.7760 - classification_loss: 0.0850 342/500 [===================>..........] - ETA: 53s - loss: 0.8607 - regression_loss: 0.7759 - classification_loss: 0.0848 343/500 [===================>..........] - ETA: 53s - loss: 0.8601 - regression_loss: 0.7753 - classification_loss: 0.0848 344/500 [===================>..........] - ETA: 53s - loss: 0.8602 - regression_loss: 0.7753 - classification_loss: 0.0849 345/500 [===================>..........] - ETA: 52s - loss: 0.8605 - regression_loss: 0.7756 - classification_loss: 0.0849 346/500 [===================>..........] - ETA: 52s - loss: 0.8601 - regression_loss: 0.7753 - classification_loss: 0.0847 347/500 [===================>..........] - ETA: 52s - loss: 0.8591 - regression_loss: 0.7745 - classification_loss: 0.0846 348/500 [===================>..........] - ETA: 51s - loss: 0.8576 - regression_loss: 0.7732 - classification_loss: 0.0844 349/500 [===================>..........] - ETA: 51s - loss: 0.8583 - regression_loss: 0.7738 - classification_loss: 0.0845 350/500 [====================>.........] 
- ETA: 51s - loss: 0.8588 - regression_loss: 0.7743 - classification_loss: 0.0845 351/500 [====================>.........] - ETA: 50s - loss: 0.8588 - regression_loss: 0.7743 - classification_loss: 0.0845 352/500 [====================>.........] - ETA: 50s - loss: 0.8601 - regression_loss: 0.7754 - classification_loss: 0.0847 353/500 [====================>.........] - ETA: 50s - loss: 0.8596 - regression_loss: 0.7750 - classification_loss: 0.0846 354/500 [====================>.........] - ETA: 49s - loss: 0.8596 - regression_loss: 0.7749 - classification_loss: 0.0847 355/500 [====================>.........] - ETA: 49s - loss: 0.8600 - regression_loss: 0.7753 - classification_loss: 0.0847 356/500 [====================>.........] - ETA: 49s - loss: 0.8585 - regression_loss: 0.7739 - classification_loss: 0.0846 357/500 [====================>.........] - ETA: 48s - loss: 0.8599 - regression_loss: 0.7751 - classification_loss: 0.0848 358/500 [====================>.........] - ETA: 48s - loss: 0.8588 - regression_loss: 0.7743 - classification_loss: 0.0846 359/500 [====================>.........] - ETA: 48s - loss: 0.8596 - regression_loss: 0.7751 - classification_loss: 0.0845 360/500 [====================>.........] - ETA: 47s - loss: 0.8601 - regression_loss: 0.7756 - classification_loss: 0.0845 361/500 [====================>.........] - ETA: 47s - loss: 0.8611 - regression_loss: 0.7764 - classification_loss: 0.0847 362/500 [====================>.........] - ETA: 47s - loss: 0.8617 - regression_loss: 0.7769 - classification_loss: 0.0848 363/500 [====================>.........] - ETA: 46s - loss: 0.8613 - regression_loss: 0.7766 - classification_loss: 0.0847 364/500 [====================>.........] - ETA: 46s - loss: 0.8620 - regression_loss: 0.7770 - classification_loss: 0.0849 365/500 [====================>.........] - ETA: 46s - loss: 0.8608 - regression_loss: 0.7757 - classification_loss: 0.0851 366/500 [====================>.........] 
- ETA: 45s - loss: 0.8614 - regression_loss: 0.7763 - classification_loss: 0.0851 367/500 [=====================>........] - ETA: 45s - loss: 0.8625 - regression_loss: 0.7773 - classification_loss: 0.0852 368/500 [=====================>........] - ETA: 44s - loss: 0.8629 - regression_loss: 0.7777 - classification_loss: 0.0852 369/500 [=====================>........] - ETA: 44s - loss: 0.8625 - regression_loss: 0.7774 - classification_loss: 0.0851 370/500 [=====================>........] - ETA: 44s - loss: 0.8638 - regression_loss: 0.7786 - classification_loss: 0.0853 371/500 [=====================>........] - ETA: 43s - loss: 0.8646 - regression_loss: 0.7792 - classification_loss: 0.0854 372/500 [=====================>........] - ETA: 43s - loss: 0.8652 - regression_loss: 0.7797 - classification_loss: 0.0855 373/500 [=====================>........] - ETA: 43s - loss: 0.8657 - regression_loss: 0.7801 - classification_loss: 0.0856 374/500 [=====================>........] - ETA: 42s - loss: 0.8654 - regression_loss: 0.7800 - classification_loss: 0.0855 375/500 [=====================>........] - ETA: 42s - loss: 0.8671 - regression_loss: 0.7813 - classification_loss: 0.0857 376/500 [=====================>........] - ETA: 42s - loss: 0.8713 - regression_loss: 0.7848 - classification_loss: 0.0865 377/500 [=====================>........] - ETA: 41s - loss: 0.8713 - regression_loss: 0.7849 - classification_loss: 0.0864 378/500 [=====================>........] - ETA: 41s - loss: 0.8718 - regression_loss: 0.7854 - classification_loss: 0.0864 379/500 [=====================>........] - ETA: 41s - loss: 0.8714 - regression_loss: 0.7851 - classification_loss: 0.0863 380/500 [=====================>........] - ETA: 40s - loss: 0.8701 - regression_loss: 0.7839 - classification_loss: 0.0862 381/500 [=====================>........] - ETA: 40s - loss: 0.8685 - regression_loss: 0.7825 - classification_loss: 0.0860 382/500 [=====================>........] 
- ETA: 40s - loss: 0.8668 - regression_loss: 0.7810 - classification_loss: 0.0858 383/500 [=====================>........] - ETA: 39s - loss: 0.8664 - regression_loss: 0.7806 - classification_loss: 0.0858 384/500 [======================>.......] - ETA: 39s - loss: 0.8663 - regression_loss: 0.7805 - classification_loss: 0.0857 385/500 [======================>.......] - ETA: 39s - loss: 0.8655 - regression_loss: 0.7798 - classification_loss: 0.0857 386/500 [======================>.......] - ETA: 38s - loss: 0.8669 - regression_loss: 0.7811 - classification_loss: 0.0858 387/500 [======================>.......] - ETA: 38s - loss: 0.8674 - regression_loss: 0.7816 - classification_loss: 0.0858 388/500 [======================>.......] - ETA: 38s - loss: 0.8668 - regression_loss: 0.7811 - classification_loss: 0.0857 389/500 [======================>.......] - ETA: 37s - loss: 0.8657 - regression_loss: 0.7800 - classification_loss: 0.0856 390/500 [======================>.......] - ETA: 37s - loss: 0.8655 - regression_loss: 0.7799 - classification_loss: 0.0855 391/500 [======================>.......] - ETA: 37s - loss: 0.8649 - regression_loss: 0.7795 - classification_loss: 0.0854 392/500 [======================>.......] - ETA: 36s - loss: 0.8651 - regression_loss: 0.7796 - classification_loss: 0.0855 393/500 [======================>.......] - ETA: 36s - loss: 0.8640 - regression_loss: 0.7787 - classification_loss: 0.0853 394/500 [======================>.......] - ETA: 36s - loss: 0.8637 - regression_loss: 0.7784 - classification_loss: 0.0852 395/500 [======================>.......] - ETA: 35s - loss: 0.8645 - regression_loss: 0.7792 - classification_loss: 0.0853 396/500 [======================>.......] - ETA: 35s - loss: 0.8651 - regression_loss: 0.7796 - classification_loss: 0.0855 397/500 [======================>.......] - ETA: 35s - loss: 0.8658 - regression_loss: 0.7804 - classification_loss: 0.0855 398/500 [======================>.......] 
- ETA: 34s - loss: 0.8649 - regression_loss: 0.7795 - classification_loss: 0.0853 399/500 [======================>.......] - ETA: 34s - loss: 0.8652 - regression_loss: 0.7799 - classification_loss: 0.0854 400/500 [=======================>......] - ETA: 34s - loss: 0.8668 - regression_loss: 0.7812 - classification_loss: 0.0857 401/500 [=======================>......] - ETA: 33s - loss: 0.8660 - regression_loss: 0.7804 - classification_loss: 0.0856 402/500 [=======================>......] - ETA: 33s - loss: 0.8652 - regression_loss: 0.7797 - classification_loss: 0.0855 403/500 [=======================>......] - ETA: 33s - loss: 0.8659 - regression_loss: 0.7802 - classification_loss: 0.0857 404/500 [=======================>......] - ETA: 32s - loss: 0.8660 - regression_loss: 0.7803 - classification_loss: 0.0856 405/500 [=======================>......] - ETA: 32s - loss: 0.8662 - regression_loss: 0.7806 - classification_loss: 0.0856 406/500 [=======================>......] - ETA: 31s - loss: 0.8653 - regression_loss: 0.7798 - classification_loss: 0.0855 407/500 [=======================>......] - ETA: 31s - loss: 0.8653 - regression_loss: 0.7799 - classification_loss: 0.0854 408/500 [=======================>......] - ETA: 31s - loss: 0.8656 - regression_loss: 0.7801 - classification_loss: 0.0855 409/500 [=======================>......] - ETA: 30s - loss: 0.8655 - regression_loss: 0.7801 - classification_loss: 0.0855 410/500 [=======================>......] - ETA: 30s - loss: 0.8652 - regression_loss: 0.7798 - classification_loss: 0.0854 411/500 [=======================>......] - ETA: 30s - loss: 0.8650 - regression_loss: 0.7796 - classification_loss: 0.0854 412/500 [=======================>......] - ETA: 29s - loss: 0.8657 - regression_loss: 0.7801 - classification_loss: 0.0856 413/500 [=======================>......] - ETA: 29s - loss: 0.8662 - regression_loss: 0.7806 - classification_loss: 0.0856 414/500 [=======================>......] 
- ETA: 29s - loss: 0.8664 - regression_loss: 0.7808 - classification_loss: 0.0856 415/500 [=======================>......] - ETA: 28s - loss: 0.8667 - regression_loss: 0.7811 - classification_loss: 0.0855 416/500 [=======================>......] - ETA: 28s - loss: 0.8662 - regression_loss: 0.7807 - classification_loss: 0.0855 417/500 [========================>.....] - ETA: 28s - loss: 0.8663 - regression_loss: 0.7807 - classification_loss: 0.0855 418/500 [========================>.....] - ETA: 27s - loss: 0.8671 - regression_loss: 0.7814 - classification_loss: 0.0857 419/500 [========================>.....] - ETA: 27s - loss: 0.8680 - regression_loss: 0.7822 - classification_loss: 0.0858 420/500 [========================>.....] - ETA: 27s - loss: 0.8674 - regression_loss: 0.7817 - classification_loss: 0.0857 421/500 [========================>.....] - ETA: 26s - loss: 0.8661 - regression_loss: 0.7806 - classification_loss: 0.0855 422/500 [========================>.....] - ETA: 26s - loss: 0.8659 - regression_loss: 0.7804 - classification_loss: 0.0855 423/500 [========================>.....] - ETA: 26s - loss: 0.8654 - regression_loss: 0.7800 - classification_loss: 0.0854 424/500 [========================>.....] - ETA: 25s - loss: 0.8648 - regression_loss: 0.7795 - classification_loss: 0.0853 425/500 [========================>.....] - ETA: 25s - loss: 0.8650 - regression_loss: 0.7796 - classification_loss: 0.0854 426/500 [========================>.....] - ETA: 25s - loss: 0.8648 - regression_loss: 0.7796 - classification_loss: 0.0853 427/500 [========================>.....] - ETA: 24s - loss: 0.8646 - regression_loss: 0.7794 - classification_loss: 0.0852 428/500 [========================>.....] - ETA: 24s - loss: 0.8639 - regression_loss: 0.7787 - classification_loss: 0.0851 429/500 [========================>.....] - ETA: 24s - loss: 0.8637 - regression_loss: 0.7785 - classification_loss: 0.0852 430/500 [========================>.....] 
- ETA: 23s - loss: 0.8639 - regression_loss: 0.7787 - classification_loss: 0.0852 431/500 [========================>.....] - ETA: 23s - loss: 0.8637 - regression_loss: 0.7785 - classification_loss: 0.0852 432/500 [========================>.....] - ETA: 23s - loss: 0.8647 - regression_loss: 0.7793 - classification_loss: 0.0853 433/500 [========================>.....] - ETA: 22s - loss: 0.8645 - regression_loss: 0.7791 - classification_loss: 0.0853 434/500 [=========================>....] - ETA: 22s - loss: 0.8644 - regression_loss: 0.7792 - classification_loss: 0.0852 435/500 [=========================>....] - ETA: 22s - loss: 0.8649 - regression_loss: 0.7795 - classification_loss: 0.0854 436/500 [=========================>....] - ETA: 21s - loss: 0.8660 - regression_loss: 0.7804 - classification_loss: 0.0856 437/500 [=========================>....] - ETA: 21s - loss: 0.8663 - regression_loss: 0.7807 - classification_loss: 0.0856 438/500 [=========================>....] - ETA: 21s - loss: 0.8658 - regression_loss: 0.7803 - classification_loss: 0.0855 439/500 [=========================>....] - ETA: 20s - loss: 0.8646 - regression_loss: 0.7792 - classification_loss: 0.0854 440/500 [=========================>....] - ETA: 20s - loss: 0.8639 - regression_loss: 0.7786 - classification_loss: 0.0853 441/500 [=========================>....] - ETA: 20s - loss: 0.8646 - regression_loss: 0.7792 - classification_loss: 0.0855 442/500 [=========================>....] - ETA: 19s - loss: 0.8648 - regression_loss: 0.7793 - classification_loss: 0.0855 443/500 [=========================>....] - ETA: 19s - loss: 0.8679 - regression_loss: 0.7819 - classification_loss: 0.0859 444/500 [=========================>....] - ETA: 19s - loss: 0.8680 - regression_loss: 0.7821 - classification_loss: 0.0859 445/500 [=========================>....] - ETA: 18s - loss: 0.8696 - regression_loss: 0.7834 - classification_loss: 0.0862 446/500 [=========================>....] 
- ETA: 18s - loss: 0.8700 - regression_loss: 0.7837 - classification_loss: 0.0862 447/500 [=========================>....] - ETA: 18s - loss: 0.8696 - regression_loss: 0.7834 - classification_loss: 0.0862 448/500 [=========================>....] - ETA: 17s - loss: 0.8702 - regression_loss: 0.7837 - classification_loss: 0.0865 449/500 [=========================>....] - ETA: 17s - loss: 0.8706 - regression_loss: 0.7842 - classification_loss: 0.0864 450/500 [==========================>...] - ETA: 16s - loss: 0.8708 - regression_loss: 0.7844 - classification_loss: 0.0864 451/500 [==========================>...] - ETA: 16s - loss: 0.8702 - regression_loss: 0.7838 - classification_loss: 0.0864 452/500 [==========================>...] - ETA: 16s - loss: 0.8690 - regression_loss: 0.7827 - classification_loss: 0.0863 453/500 [==========================>...] - ETA: 15s - loss: 0.8696 - regression_loss: 0.7832 - classification_loss: 0.0864 454/500 [==========================>...] - ETA: 15s - loss: 0.8700 - regression_loss: 0.7836 - classification_loss: 0.0864 455/500 [==========================>...] - ETA: 15s - loss: 0.8694 - regression_loss: 0.7831 - classification_loss: 0.0864 456/500 [==========================>...] - ETA: 14s - loss: 0.8684 - regression_loss: 0.7822 - classification_loss: 0.0862 457/500 [==========================>...] - ETA: 14s - loss: 0.8688 - regression_loss: 0.7826 - classification_loss: 0.0863 458/500 [==========================>...] - ETA: 14s - loss: 0.8681 - regression_loss: 0.7820 - classification_loss: 0.0861 459/500 [==========================>...] - ETA: 13s - loss: 0.8679 - regression_loss: 0.7818 - classification_loss: 0.0860 460/500 [==========================>...] - ETA: 13s - loss: 0.8664 - regression_loss: 0.7806 - classification_loss: 0.0859 461/500 [==========================>...] - ETA: 13s - loss: 0.8656 - regression_loss: 0.7799 - classification_loss: 0.0857 462/500 [==========================>...] 
- ETA: 12s - loss: 0.8653 - regression_loss: 0.7797 - classification_loss: 0.0857 463/500 [==========================>...] - ETA: 12s - loss: 0.8645 - regression_loss: 0.7789 - classification_loss: 0.0856 464/500 [==========================>...] - ETA: 12s - loss: 0.8653 - regression_loss: 0.7796 - classification_loss: 0.0857 465/500 [==========================>...] - ETA: 11s - loss: 0.8664 - regression_loss: 0.7805 - classification_loss: 0.0859 466/500 [==========================>...] - ETA: 11s - loss: 0.8663 - regression_loss: 0.7804 - classification_loss: 0.0859 467/500 [===========================>..] - ETA: 11s - loss: 0.8669 - regression_loss: 0.7810 - classification_loss: 0.0859 468/500 [===========================>..] - ETA: 10s - loss: 0.8667 - regression_loss: 0.7808 - classification_loss: 0.0859 469/500 [===========================>..] - ETA: 10s - loss: 0.8667 - regression_loss: 0.7808 - classification_loss: 0.0859 470/500 [===========================>..] - ETA: 10s - loss: 0.8666 - regression_loss: 0.7807 - classification_loss: 0.0859 471/500 [===========================>..] - ETA: 9s - loss: 0.8661 - regression_loss: 0.7801 - classification_loss: 0.0859  472/500 [===========================>..] - ETA: 9s - loss: 0.8667 - regression_loss: 0.7807 - classification_loss: 0.0860 473/500 [===========================>..] - ETA: 9s - loss: 0.8658 - regression_loss: 0.7800 - classification_loss: 0.0859 474/500 [===========================>..] - ETA: 8s - loss: 0.8664 - regression_loss: 0.7805 - classification_loss: 0.0859 475/500 [===========================>..] - ETA: 8s - loss: 0.8673 - regression_loss: 0.7813 - classification_loss: 0.0861 476/500 [===========================>..] - ETA: 8s - loss: 0.8663 - regression_loss: 0.7804 - classification_loss: 0.0860 477/500 [===========================>..] - ETA: 7s - loss: 0.8659 - regression_loss: 0.7800 - classification_loss: 0.0859 478/500 [===========================>..] 
- ETA: 7s - loss: 0.8653 - regression_loss: 0.7795 - classification_loss: 0.0858 479/500 [===========================>..] - ETA: 7s - loss: 0.8648 - regression_loss: 0.7791 - classification_loss: 0.0857 480/500 [===========================>..] - ETA: 6s - loss: 0.8648 - regression_loss: 0.7791 - classification_loss: 0.0857 481/500 [===========================>..] - ETA: 6s - loss: 0.8655 - regression_loss: 0.7799 - classification_loss: 0.0857 482/500 [===========================>..] - ETA: 6s - loss: 0.8659 - regression_loss: 0.7801 - classification_loss: 0.0858 483/500 [===========================>..] - ETA: 5s - loss: 0.8659 - regression_loss: 0.7801 - classification_loss: 0.0858 484/500 [============================>.] - ETA: 5s - loss: 0.8664 - regression_loss: 0.7805 - classification_loss: 0.0859 485/500 [============================>.] - ETA: 5s - loss: 0.8665 - regression_loss: 0.7805 - classification_loss: 0.0859 486/500 [============================>.] - ETA: 4s - loss: 0.8655 - regression_loss: 0.7797 - classification_loss: 0.0858 487/500 [============================>.] - ETA: 4s - loss: 0.8655 - regression_loss: 0.7798 - classification_loss: 0.0857 488/500 [============================>.] - ETA: 4s - loss: 0.8660 - regression_loss: 0.7803 - classification_loss: 0.0857 489/500 [============================>.] - ETA: 3s - loss: 0.8664 - regression_loss: 0.7808 - classification_loss: 0.0856 490/500 [============================>.] - ETA: 3s - loss: 0.8669 - regression_loss: 0.7812 - classification_loss: 0.0857 491/500 [============================>.] - ETA: 3s - loss: 0.8663 - regression_loss: 0.7807 - classification_loss: 0.0856 492/500 [============================>.] - ETA: 2s - loss: 0.8663 - regression_loss: 0.7808 - classification_loss: 0.0856 493/500 [============================>.] - ETA: 2s - loss: 0.8658 - regression_loss: 0.7804 - classification_loss: 0.0855 494/500 [============================>.] 
- ETA: 2s - loss: 0.8660 - regression_loss: 0.7806 - classification_loss: 0.0855 495/500 [============================>.] - ETA: 1s - loss: 0.8660 - regression_loss: 0.7805 - classification_loss: 0.0855 496/500 [============================>.] - ETA: 1s - loss: 0.8654 - regression_loss: 0.7800 - classification_loss: 0.0854 497/500 [============================>.] - ETA: 1s - loss: 0.8649 - regression_loss: 0.7796 - classification_loss: 0.0853 498/500 [============================>.] - ETA: 0s - loss: 0.8652 - regression_loss: 0.7798 - classification_loss: 0.0855 499/500 [============================>.] - ETA: 0s - loss: 0.8650 - regression_loss: 0.7796 - classification_loss: 0.0854 500/500 [==============================] - 170s 340ms/step - loss: 0.8653 - regression_loss: 0.7800 - classification_loss: 0.0854 1172 instances of class plum with average precision: 0.8033 mAP: 0.8033 Epoch 00040: saving model to ./training/snapshots/resnet101_pascal_40.h5 Epoch 41/150 1/500 [..............................] - ETA: 2:34 - loss: 0.7469 - regression_loss: 0.6704 - classification_loss: 0.0765 2/500 [..............................] - ETA: 2:40 - loss: 0.5972 - regression_loss: 0.5535 - classification_loss: 0.0437 3/500 [..............................] - ETA: 2:42 - loss: 0.6635 - regression_loss: 0.6261 - classification_loss: 0.0374 4/500 [..............................] - ETA: 2:40 - loss: 1.1510 - regression_loss: 0.9616 - classification_loss: 0.1894 5/500 [..............................] - ETA: 2:38 - loss: 1.0502 - regression_loss: 0.8868 - classification_loss: 0.1635 6/500 [..............................] - ETA: 2:38 - loss: 0.9228 - regression_loss: 0.7809 - classification_loss: 0.1419 7/500 [..............................] - ETA: 2:40 - loss: 0.8613 - regression_loss: 0.7364 - classification_loss: 0.1249 8/500 [..............................] - ETA: 2:41 - loss: 0.8279 - regression_loss: 0.7145 - classification_loss: 0.1135 9/500 [..............................] 
- ETA: 2:41 - loss: 0.7752 - regression_loss: 0.6718 - classification_loss: 0.1034
[... per-step progress updates for steps 10-344/500 condensed; loss fluctuates roughly between 0.78 and 0.88 (regression_loss ~0.76, classification_loss ~0.08); representative steps kept below ...]
 10/500 [..............................] - ETA: 2:41 - loss: 0.8016 - regression_loss: 0.6986 - classification_loss: 0.1031
100/500 [=====>........................] - ETA: 2:15 - loss: 0.8408 - regression_loss: 0.7614 - classification_loss: 0.0794
200/500 [===========>..................] - ETA: 1:41 - loss: 0.8421 - regression_loss: 0.7596 - classification_loss: 0.0826
300/500 [=================>............] - ETA: 1:07 - loss: 0.8419 - regression_loss: 0.7594 - classification_loss: 0.0825
344/500 [===================>..........] - ETA: 52s - loss: 0.8544 - regression_loss: 0.7708 - classification_loss: 0.0836
345/500 [===================>..........]
- ETA: 52s - loss: 0.8548 - regression_loss: 0.7712 - classification_loss: 0.0836 346/500 [===================>..........] - ETA: 52s - loss: 0.8551 - regression_loss: 0.7714 - classification_loss: 0.0837 347/500 [===================>..........] - ETA: 51s - loss: 0.8553 - regression_loss: 0.7716 - classification_loss: 0.0837 348/500 [===================>..........] - ETA: 51s - loss: 0.8551 - regression_loss: 0.7714 - classification_loss: 0.0837 349/500 [===================>..........] - ETA: 51s - loss: 0.8542 - regression_loss: 0.7707 - classification_loss: 0.0836 350/500 [====================>.........] - ETA: 50s - loss: 0.8529 - regression_loss: 0.7695 - classification_loss: 0.0834 351/500 [====================>.........] - ETA: 50s - loss: 0.8521 - regression_loss: 0.7689 - classification_loss: 0.0832 352/500 [====================>.........] - ETA: 50s - loss: 0.8512 - regression_loss: 0.7681 - classification_loss: 0.0831 353/500 [====================>.........] - ETA: 49s - loss: 0.8502 - regression_loss: 0.7673 - classification_loss: 0.0829 354/500 [====================>.........] - ETA: 49s - loss: 0.8490 - regression_loss: 0.7662 - classification_loss: 0.0828 355/500 [====================>.........] - ETA: 49s - loss: 0.8490 - regression_loss: 0.7661 - classification_loss: 0.0829 356/500 [====================>.........] - ETA: 48s - loss: 0.8480 - regression_loss: 0.7653 - classification_loss: 0.0828 357/500 [====================>.........] - ETA: 48s - loss: 0.8482 - regression_loss: 0.7655 - classification_loss: 0.0827 358/500 [====================>.........] - ETA: 48s - loss: 0.8487 - regression_loss: 0.7660 - classification_loss: 0.0827 359/500 [====================>.........] - ETA: 47s - loss: 0.8484 - regression_loss: 0.7659 - classification_loss: 0.0825 360/500 [====================>.........] - ETA: 47s - loss: 0.8499 - regression_loss: 0.7674 - classification_loss: 0.0825 361/500 [====================>.........] 
- ETA: 47s - loss: 0.8507 - regression_loss: 0.7683 - classification_loss: 0.0824 362/500 [====================>.........] - ETA: 46s - loss: 0.8519 - regression_loss: 0.7692 - classification_loss: 0.0827 363/500 [====================>.........] - ETA: 46s - loss: 0.8521 - regression_loss: 0.7694 - classification_loss: 0.0828 364/500 [====================>.........] - ETA: 46s - loss: 0.8531 - regression_loss: 0.7703 - classification_loss: 0.0828 365/500 [====================>.........] - ETA: 45s - loss: 0.8539 - regression_loss: 0.7710 - classification_loss: 0.0829 366/500 [====================>.........] - ETA: 45s - loss: 0.8523 - regression_loss: 0.7696 - classification_loss: 0.0827 367/500 [=====================>........] - ETA: 45s - loss: 0.8529 - regression_loss: 0.7699 - classification_loss: 0.0829 368/500 [=====================>........] - ETA: 44s - loss: 0.8526 - regression_loss: 0.7699 - classification_loss: 0.0827 369/500 [=====================>........] - ETA: 44s - loss: 0.8512 - regression_loss: 0.7686 - classification_loss: 0.0826 370/500 [=====================>........] - ETA: 44s - loss: 0.8524 - regression_loss: 0.7697 - classification_loss: 0.0827 371/500 [=====================>........] - ETA: 43s - loss: 0.8532 - regression_loss: 0.7706 - classification_loss: 0.0826 372/500 [=====================>........] - ETA: 43s - loss: 0.8538 - regression_loss: 0.7711 - classification_loss: 0.0827 373/500 [=====================>........] - ETA: 43s - loss: 0.8530 - regression_loss: 0.7705 - classification_loss: 0.0826 374/500 [=====================>........] - ETA: 42s - loss: 0.8521 - regression_loss: 0.7697 - classification_loss: 0.0824 375/500 [=====================>........] - ETA: 42s - loss: 0.8509 - regression_loss: 0.7686 - classification_loss: 0.0823 376/500 [=====================>........] - ETA: 42s - loss: 0.8509 - regression_loss: 0.7686 - classification_loss: 0.0823 377/500 [=====================>........] 
- ETA: 41s - loss: 0.8511 - regression_loss: 0.7688 - classification_loss: 0.0824 378/500 [=====================>........] - ETA: 41s - loss: 0.8510 - regression_loss: 0.7687 - classification_loss: 0.0823 379/500 [=====================>........] - ETA: 41s - loss: 0.8502 - regression_loss: 0.7678 - classification_loss: 0.0823 380/500 [=====================>........] - ETA: 40s - loss: 0.8490 - regression_loss: 0.7668 - classification_loss: 0.0822 381/500 [=====================>........] - ETA: 40s - loss: 0.8488 - regression_loss: 0.7667 - classification_loss: 0.0821 382/500 [=====================>........] - ETA: 40s - loss: 0.8485 - regression_loss: 0.7665 - classification_loss: 0.0820 383/500 [=====================>........] - ETA: 39s - loss: 0.8474 - regression_loss: 0.7655 - classification_loss: 0.0819 384/500 [======================>.......] - ETA: 39s - loss: 0.8481 - regression_loss: 0.7661 - classification_loss: 0.0820 385/500 [======================>.......] - ETA: 38s - loss: 0.8478 - regression_loss: 0.7658 - classification_loss: 0.0820 386/500 [======================>.......] - ETA: 38s - loss: 0.8493 - regression_loss: 0.7670 - classification_loss: 0.0823 387/500 [======================>.......] - ETA: 38s - loss: 0.8499 - regression_loss: 0.7673 - classification_loss: 0.0825 388/500 [======================>.......] - ETA: 37s - loss: 0.8498 - regression_loss: 0.7674 - classification_loss: 0.0824 389/500 [======================>.......] - ETA: 37s - loss: 0.8488 - regression_loss: 0.7665 - classification_loss: 0.0823 390/500 [======================>.......] - ETA: 37s - loss: 0.8504 - regression_loss: 0.7679 - classification_loss: 0.0825 391/500 [======================>.......] - ETA: 36s - loss: 0.8507 - regression_loss: 0.7682 - classification_loss: 0.0825 392/500 [======================>.......] - ETA: 36s - loss: 0.8508 - regression_loss: 0.7683 - classification_loss: 0.0825 393/500 [======================>.......] 
- ETA: 36s - loss: 0.8512 - regression_loss: 0.7686 - classification_loss: 0.0826 394/500 [======================>.......] - ETA: 35s - loss: 0.8512 - regression_loss: 0.7685 - classification_loss: 0.0827 395/500 [======================>.......] - ETA: 35s - loss: 0.8513 - regression_loss: 0.7686 - classification_loss: 0.0827 396/500 [======================>.......] - ETA: 35s - loss: 0.8513 - regression_loss: 0.7686 - classification_loss: 0.0827 397/500 [======================>.......] - ETA: 34s - loss: 0.8505 - regression_loss: 0.7679 - classification_loss: 0.0826 398/500 [======================>.......] - ETA: 34s - loss: 0.8500 - regression_loss: 0.7674 - classification_loss: 0.0826 399/500 [======================>.......] - ETA: 34s - loss: 0.8495 - regression_loss: 0.7670 - classification_loss: 0.0825 400/500 [=======================>......] - ETA: 33s - loss: 0.8495 - regression_loss: 0.7669 - classification_loss: 0.0826 401/500 [=======================>......] - ETA: 33s - loss: 0.8499 - regression_loss: 0.7672 - classification_loss: 0.0827 402/500 [=======================>......] - ETA: 33s - loss: 0.8509 - regression_loss: 0.7681 - classification_loss: 0.0828 403/500 [=======================>......] - ETA: 32s - loss: 0.8508 - regression_loss: 0.7679 - classification_loss: 0.0828 404/500 [=======================>......] - ETA: 32s - loss: 0.8517 - regression_loss: 0.7688 - classification_loss: 0.0830 405/500 [=======================>......] - ETA: 32s - loss: 0.8527 - regression_loss: 0.7696 - classification_loss: 0.0831 406/500 [=======================>......] - ETA: 31s - loss: 0.8533 - regression_loss: 0.7700 - classification_loss: 0.0833 407/500 [=======================>......] - ETA: 31s - loss: 0.8538 - regression_loss: 0.7705 - classification_loss: 0.0833 408/500 [=======================>......] - ETA: 31s - loss: 0.8531 - regression_loss: 0.7698 - classification_loss: 0.0832 409/500 [=======================>......] 
- ETA: 30s - loss: 0.8559 - regression_loss: 0.7723 - classification_loss: 0.0836 410/500 [=======================>......] - ETA: 30s - loss: 0.8556 - regression_loss: 0.7721 - classification_loss: 0.0835 411/500 [=======================>......] - ETA: 30s - loss: 0.8551 - regression_loss: 0.7716 - classification_loss: 0.0834 412/500 [=======================>......] - ETA: 29s - loss: 0.8548 - regression_loss: 0.7714 - classification_loss: 0.0834 413/500 [=======================>......] - ETA: 29s - loss: 0.8554 - regression_loss: 0.7718 - classification_loss: 0.0836 414/500 [=======================>......] - ETA: 29s - loss: 0.8547 - regression_loss: 0.7713 - classification_loss: 0.0834 415/500 [=======================>......] - ETA: 28s - loss: 0.8559 - regression_loss: 0.7723 - classification_loss: 0.0836 416/500 [=======================>......] - ETA: 28s - loss: 0.8573 - regression_loss: 0.7734 - classification_loss: 0.0839 417/500 [========================>.....] - ETA: 28s - loss: 0.8580 - regression_loss: 0.7741 - classification_loss: 0.0839 418/500 [========================>.....] - ETA: 27s - loss: 0.8587 - regression_loss: 0.7747 - classification_loss: 0.0840 419/500 [========================>.....] - ETA: 27s - loss: 0.8579 - regression_loss: 0.7741 - classification_loss: 0.0839 420/500 [========================>.....] - ETA: 27s - loss: 0.8575 - regression_loss: 0.7737 - classification_loss: 0.0838 421/500 [========================>.....] - ETA: 26s - loss: 0.8577 - regression_loss: 0.7739 - classification_loss: 0.0838 422/500 [========================>.....] - ETA: 26s - loss: 0.8574 - regression_loss: 0.7736 - classification_loss: 0.0838 423/500 [========================>.....] - ETA: 26s - loss: 0.8564 - regression_loss: 0.7728 - classification_loss: 0.0837 424/500 [========================>.....] - ETA: 25s - loss: 0.8563 - regression_loss: 0.7727 - classification_loss: 0.0837 425/500 [========================>.....] 
- ETA: 25s - loss: 0.8564 - regression_loss: 0.7727 - classification_loss: 0.0837 426/500 [========================>.....] - ETA: 25s - loss: 0.8560 - regression_loss: 0.7724 - classification_loss: 0.0836 427/500 [========================>.....] - ETA: 24s - loss: 0.8564 - regression_loss: 0.7727 - classification_loss: 0.0837 428/500 [========================>.....] - ETA: 24s - loss: 0.8568 - regression_loss: 0.7730 - classification_loss: 0.0838 429/500 [========================>.....] - ETA: 24s - loss: 0.8574 - regression_loss: 0.7736 - classification_loss: 0.0838 430/500 [========================>.....] - ETA: 23s - loss: 0.8571 - regression_loss: 0.7734 - classification_loss: 0.0838 431/500 [========================>.....] - ETA: 23s - loss: 0.8567 - regression_loss: 0.7731 - classification_loss: 0.0836 432/500 [========================>.....] - ETA: 23s - loss: 0.8558 - regression_loss: 0.7723 - classification_loss: 0.0836 433/500 [========================>.....] - ETA: 22s - loss: 0.8554 - regression_loss: 0.7719 - classification_loss: 0.0834 434/500 [=========================>....] - ETA: 22s - loss: 0.8557 - regression_loss: 0.7722 - classification_loss: 0.0835 435/500 [=========================>....] - ETA: 22s - loss: 0.8564 - regression_loss: 0.7728 - classification_loss: 0.0836 436/500 [=========================>....] - ETA: 21s - loss: 0.8556 - regression_loss: 0.7721 - classification_loss: 0.0835 437/500 [=========================>....] - ETA: 21s - loss: 0.8553 - regression_loss: 0.7718 - classification_loss: 0.0835 438/500 [=========================>....] - ETA: 21s - loss: 0.8553 - regression_loss: 0.7719 - classification_loss: 0.0835 439/500 [=========================>....] - ETA: 20s - loss: 0.8564 - regression_loss: 0.7728 - classification_loss: 0.0836 440/500 [=========================>....] - ETA: 20s - loss: 0.8563 - regression_loss: 0.7727 - classification_loss: 0.0836 441/500 [=========================>....] 
- ETA: 19s - loss: 0.8562 - regression_loss: 0.7726 - classification_loss: 0.0837 442/500 [=========================>....] - ETA: 19s - loss: 0.8563 - regression_loss: 0.7726 - classification_loss: 0.0837 443/500 [=========================>....] - ETA: 19s - loss: 0.8555 - regression_loss: 0.7720 - classification_loss: 0.0835 444/500 [=========================>....] - ETA: 18s - loss: 0.8560 - regression_loss: 0.7723 - classification_loss: 0.0837 445/500 [=========================>....] - ETA: 18s - loss: 0.8554 - regression_loss: 0.7718 - classification_loss: 0.0835 446/500 [=========================>....] - ETA: 18s - loss: 0.8546 - regression_loss: 0.7712 - classification_loss: 0.0835 447/500 [=========================>....] - ETA: 17s - loss: 0.8567 - regression_loss: 0.7729 - classification_loss: 0.0838 448/500 [=========================>....] - ETA: 17s - loss: 0.8562 - regression_loss: 0.7725 - classification_loss: 0.0837 449/500 [=========================>....] - ETA: 17s - loss: 0.8550 - regression_loss: 0.7714 - classification_loss: 0.0836 450/500 [==========================>...] - ETA: 16s - loss: 0.8551 - regression_loss: 0.7715 - classification_loss: 0.0836 451/500 [==========================>...] - ETA: 16s - loss: 0.8557 - regression_loss: 0.7720 - classification_loss: 0.0837 452/500 [==========================>...] - ETA: 16s - loss: 0.8546 - regression_loss: 0.7710 - classification_loss: 0.0836 453/500 [==========================>...] - ETA: 15s - loss: 0.8545 - regression_loss: 0.7709 - classification_loss: 0.0836 454/500 [==========================>...] - ETA: 15s - loss: 0.8546 - regression_loss: 0.7709 - classification_loss: 0.0836 455/500 [==========================>...] - ETA: 15s - loss: 0.8544 - regression_loss: 0.7708 - classification_loss: 0.0836 456/500 [==========================>...] - ETA: 14s - loss: 0.8546 - regression_loss: 0.7709 - classification_loss: 0.0837 457/500 [==========================>...] 
- ETA: 14s - loss: 0.8545 - regression_loss: 0.7709 - classification_loss: 0.0836 458/500 [==========================>...] - ETA: 14s - loss: 0.8550 - regression_loss: 0.7712 - classification_loss: 0.0838 459/500 [==========================>...] - ETA: 13s - loss: 0.8542 - regression_loss: 0.7705 - classification_loss: 0.0837 460/500 [==========================>...] - ETA: 13s - loss: 0.8549 - regression_loss: 0.7711 - classification_loss: 0.0838 461/500 [==========================>...] - ETA: 13s - loss: 0.8554 - regression_loss: 0.7715 - classification_loss: 0.0838 462/500 [==========================>...] - ETA: 12s - loss: 0.8552 - regression_loss: 0.7714 - classification_loss: 0.0838 463/500 [==========================>...] - ETA: 12s - loss: 0.8548 - regression_loss: 0.7709 - classification_loss: 0.0838 464/500 [==========================>...] - ETA: 12s - loss: 0.8536 - regression_loss: 0.7698 - classification_loss: 0.0838 465/500 [==========================>...] - ETA: 11s - loss: 0.8541 - regression_loss: 0.7703 - classification_loss: 0.0839 466/500 [==========================>...] - ETA: 11s - loss: 0.8533 - regression_loss: 0.7696 - classification_loss: 0.0837 467/500 [===========================>..] - ETA: 11s - loss: 0.8540 - regression_loss: 0.7701 - classification_loss: 0.0838 468/500 [===========================>..] - ETA: 10s - loss: 0.8541 - regression_loss: 0.7702 - classification_loss: 0.0839 469/500 [===========================>..] - ETA: 10s - loss: 0.8544 - regression_loss: 0.7705 - classification_loss: 0.0839 470/500 [===========================>..] - ETA: 10s - loss: 0.8547 - regression_loss: 0.7707 - classification_loss: 0.0840 471/500 [===========================>..] - ETA: 9s - loss: 0.8570 - regression_loss: 0.7729 - classification_loss: 0.0841  472/500 [===========================>..] - ETA: 9s - loss: 0.8571 - regression_loss: 0.7729 - classification_loss: 0.0841 473/500 [===========================>..] 
- ETA: 9s - loss: 0.8577 - regression_loss: 0.7734 - classification_loss: 0.0842 474/500 [===========================>..] - ETA: 8s - loss: 0.8576 - regression_loss: 0.7734 - classification_loss: 0.0842 475/500 [===========================>..] - ETA: 8s - loss: 0.8571 - regression_loss: 0.7730 - classification_loss: 0.0841 476/500 [===========================>..] - ETA: 8s - loss: 0.8566 - regression_loss: 0.7726 - classification_loss: 0.0841 477/500 [===========================>..] - ETA: 7s - loss: 0.8565 - regression_loss: 0.7725 - classification_loss: 0.0840 478/500 [===========================>..] - ETA: 7s - loss: 0.8568 - regression_loss: 0.7729 - classification_loss: 0.0840 479/500 [===========================>..] - ETA: 7s - loss: 0.8572 - regression_loss: 0.7733 - classification_loss: 0.0840 480/500 [===========================>..] - ETA: 6s - loss: 0.8570 - regression_loss: 0.7730 - classification_loss: 0.0839 481/500 [===========================>..] - ETA: 6s - loss: 0.8567 - regression_loss: 0.7729 - classification_loss: 0.0839 482/500 [===========================>..] - ETA: 6s - loss: 0.8560 - regression_loss: 0.7722 - classification_loss: 0.0838 483/500 [===========================>..] - ETA: 5s - loss: 0.8549 - regression_loss: 0.7713 - classification_loss: 0.0837 484/500 [============================>.] - ETA: 5s - loss: 0.8549 - regression_loss: 0.7712 - classification_loss: 0.0837 485/500 [============================>.] - ETA: 5s - loss: 0.8548 - regression_loss: 0.7711 - classification_loss: 0.0836 486/500 [============================>.] - ETA: 4s - loss: 0.8535 - regression_loss: 0.7700 - classification_loss: 0.0835 487/500 [============================>.] - ETA: 4s - loss: 0.8534 - regression_loss: 0.7700 - classification_loss: 0.0835 488/500 [============================>.] - ETA: 4s - loss: 0.8531 - regression_loss: 0.7696 - classification_loss: 0.0834 489/500 [============================>.] 
- ETA: 3s - loss: 0.8531 - regression_loss: 0.7697 - classification_loss: 0.0835 490/500 [============================>.] - ETA: 3s - loss: 0.8519 - regression_loss: 0.7686 - classification_loss: 0.0833 491/500 [============================>.] - ETA: 3s - loss: 0.8512 - regression_loss: 0.7680 - classification_loss: 0.0832 492/500 [============================>.] - ETA: 2s - loss: 0.8515 - regression_loss: 0.7682 - classification_loss: 0.0832 493/500 [============================>.] - ETA: 2s - loss: 0.8517 - regression_loss: 0.7685 - classification_loss: 0.0832 494/500 [============================>.] - ETA: 2s - loss: 0.8519 - regression_loss: 0.7686 - classification_loss: 0.0833 495/500 [============================>.] - ETA: 1s - loss: 0.8529 - regression_loss: 0.7694 - classification_loss: 0.0835 496/500 [============================>.] - ETA: 1s - loss: 0.8521 - regression_loss: 0.7687 - classification_loss: 0.0834 497/500 [============================>.] - ETA: 1s - loss: 0.8530 - regression_loss: 0.7695 - classification_loss: 0.0835 498/500 [============================>.] - ETA: 0s - loss: 0.8534 - regression_loss: 0.7698 - classification_loss: 0.0836 499/500 [============================>.] - ETA: 0s - loss: 0.8533 - regression_loss: 0.7697 - classification_loss: 0.0836 500/500 [==============================] - 169s 339ms/step - loss: 0.8521 - regression_loss: 0.7686 - classification_loss: 0.0835 1172 instances of class plum with average precision: 0.7965 mAP: 0.7965 Epoch 00041: saving model to ./training/snapshots/resnet101_pascal_41.h5 Epoch 42/150 1/500 [..............................] - ETA: 2:33 - loss: 0.8127 - regression_loss: 0.7288 - classification_loss: 0.0838 2/500 [..............................] - ETA: 2:43 - loss: 0.5603 - regression_loss: 0.5065 - classification_loss: 0.0538 3/500 [..............................] - ETA: 2:48 - loss: 0.8436 - regression_loss: 0.7504 - classification_loss: 0.0932 4/500 [..............................] 
- ETA: 2:48 - loss: 0.8632 - regression_loss: 0.7656 - classification_loss: 0.0976 5/500 [..............................] - ETA: 2:49 - loss: 0.8849 - regression_loss: 0.7919 - classification_loss: 0.0930 6/500 [..............................] - ETA: 2:48 - loss: 0.9120 - regression_loss: 0.8124 - classification_loss: 0.0996 7/500 [..............................] - ETA: 2:47 - loss: 0.9229 - regression_loss: 0.8207 - classification_loss: 0.1022 8/500 [..............................] - ETA: 2:47 - loss: 0.9046 - regression_loss: 0.8072 - classification_loss: 0.0974 9/500 [..............................] - ETA: 2:47 - loss: 0.8840 - regression_loss: 0.7935 - classification_loss: 0.0905 10/500 [..............................] - ETA: 2:47 - loss: 0.8942 - regression_loss: 0.8051 - classification_loss: 0.0890 11/500 [..............................] - ETA: 2:46 - loss: 0.8530 - regression_loss: 0.7680 - classification_loss: 0.0850 12/500 [..............................] - ETA: 2:46 - loss: 0.8380 - regression_loss: 0.7553 - classification_loss: 0.0828 13/500 [..............................] - ETA: 2:46 - loss: 0.8488 - regression_loss: 0.7666 - classification_loss: 0.0822 14/500 [..............................] - ETA: 2:45 - loss: 0.8694 - regression_loss: 0.7879 - classification_loss: 0.0815 15/500 [..............................] - ETA: 2:45 - loss: 0.8458 - regression_loss: 0.7679 - classification_loss: 0.0780 16/500 [..............................] - ETA: 2:44 - loss: 0.8647 - regression_loss: 0.7834 - classification_loss: 0.0814 17/500 [>.............................] - ETA: 2:43 - loss: 0.8367 - regression_loss: 0.7591 - classification_loss: 0.0776 18/500 [>.............................] - ETA: 2:43 - loss: 0.8538 - regression_loss: 0.7732 - classification_loss: 0.0807 19/500 [>.............................] - ETA: 2:43 - loss: 0.8323 - regression_loss: 0.7552 - classification_loss: 0.0771 20/500 [>.............................] 
- ETA: 2:42 - loss: 0.8064 - regression_loss: 0.7317 - classification_loss: 0.0747 21/500 [>.............................] - ETA: 2:42 - loss: 0.7825 - regression_loss: 0.7106 - classification_loss: 0.0719 22/500 [>.............................] - ETA: 2:41 - loss: 0.7788 - regression_loss: 0.7076 - classification_loss: 0.0712 23/500 [>.............................] - ETA: 2:41 - loss: 0.7953 - regression_loss: 0.7213 - classification_loss: 0.0740 24/500 [>.............................] - ETA: 2:41 - loss: 0.8050 - regression_loss: 0.7298 - classification_loss: 0.0752 25/500 [>.............................] - ETA: 2:40 - loss: 0.8136 - regression_loss: 0.7385 - classification_loss: 0.0750 26/500 [>.............................] - ETA: 2:40 - loss: 0.8285 - regression_loss: 0.7530 - classification_loss: 0.0755 27/500 [>.............................] - ETA: 2:40 - loss: 0.8442 - regression_loss: 0.7697 - classification_loss: 0.0746 28/500 [>.............................] - ETA: 2:39 - loss: 0.8423 - regression_loss: 0.7681 - classification_loss: 0.0742 29/500 [>.............................] - ETA: 2:39 - loss: 0.8488 - regression_loss: 0.7730 - classification_loss: 0.0757 30/500 [>.............................] - ETA: 2:39 - loss: 0.8371 - regression_loss: 0.7625 - classification_loss: 0.0745 31/500 [>.............................] - ETA: 2:38 - loss: 0.8493 - regression_loss: 0.7721 - classification_loss: 0.0772 32/500 [>.............................] - ETA: 2:38 - loss: 0.8493 - regression_loss: 0.7717 - classification_loss: 0.0776 33/500 [>.............................] - ETA: 2:37 - loss: 0.8408 - regression_loss: 0.7644 - classification_loss: 0.0765 34/500 [=>............................] - ETA: 2:37 - loss: 0.8369 - regression_loss: 0.7608 - classification_loss: 0.0760 35/500 [=>............................] - ETA: 2:37 - loss: 0.8223 - regression_loss: 0.7476 - classification_loss: 0.0747 36/500 [=>............................] 
- ETA: 2:36 - loss: 0.8195 - regression_loss: 0.7454 - classification_loss: 0.0741 37/500 [=>............................] - ETA: 2:36 - loss: 0.8226 - regression_loss: 0.7480 - classification_loss: 0.0746 38/500 [=>............................] - ETA: 2:36 - loss: 0.8190 - regression_loss: 0.7440 - classification_loss: 0.0750 39/500 [=>............................] - ETA: 2:35 - loss: 0.8344 - regression_loss: 0.7588 - classification_loss: 0.0756 40/500 [=>............................] - ETA: 2:35 - loss: 0.8238 - regression_loss: 0.7498 - classification_loss: 0.0741 41/500 [=>............................] - ETA: 2:35 - loss: 0.8134 - regression_loss: 0.7407 - classification_loss: 0.0727 42/500 [=>............................] - ETA: 2:34 - loss: 0.8219 - regression_loss: 0.7481 - classification_loss: 0.0738 43/500 [=>............................] - ETA: 2:34 - loss: 0.8189 - regression_loss: 0.7456 - classification_loss: 0.0732 44/500 [=>............................] - ETA: 2:34 - loss: 0.8262 - regression_loss: 0.7497 - classification_loss: 0.0765 45/500 [=>............................] - ETA: 2:33 - loss: 0.8305 - regression_loss: 0.7532 - classification_loss: 0.0773 46/500 [=>............................] - ETA: 2:33 - loss: 0.8360 - regression_loss: 0.7581 - classification_loss: 0.0779 47/500 [=>............................] - ETA: 2:32 - loss: 0.8280 - regression_loss: 0.7514 - classification_loss: 0.0767 48/500 [=>............................] - ETA: 2:32 - loss: 0.8369 - regression_loss: 0.7586 - classification_loss: 0.0783 49/500 [=>............................] - ETA: 2:32 - loss: 0.8287 - regression_loss: 0.7516 - classification_loss: 0.0770 50/500 [==>...........................] - ETA: 2:32 - loss: 0.8276 - regression_loss: 0.7517 - classification_loss: 0.0759 51/500 [==>...........................] - ETA: 2:31 - loss: 0.8227 - regression_loss: 0.7475 - classification_loss: 0.0751 52/500 [==>...........................] 
- ETA: 2:31 - loss: 0.8209 - regression_loss: 0.7464 - classification_loss: 0.0746
100/500 [=====>........................] - ETA: 2:14 - loss: 0.8445 - regression_loss: 0.7656 - classification_loss: 0.0789
150/500 [========>.....................] - ETA: 1:58 - loss: 0.8578 - regression_loss: 0.7756 - classification_loss: 0.0822
200/500 [===========>..................] - ETA: 1:41 - loss: 0.8447 - regression_loss: 0.7631 - classification_loss: 0.0816
250/500 [==============>...............] - ETA: 1:24 - loss: 0.8527 - regression_loss: 0.7709 - classification_loss: 0.0818
300/500 [=================>............] - ETA: 1:07 - loss: 0.8544 - regression_loss: 0.7718 - classification_loss: 0.0826
350/500 [====================>.........] - ETA: 50s - loss: 0.8513 - regression_loss: 0.7694 - classification_loss: 0.0819
388/500 [======================>.......]
- ETA: 37s - loss: 0.8597 - regression_loss: 0.7766 - classification_loss: 0.0831 389/500 [======================>.......] - ETA: 37s - loss: 0.8604 - regression_loss: 0.7772 - classification_loss: 0.0832 390/500 [======================>.......] - ETA: 37s - loss: 0.8607 - regression_loss: 0.7776 - classification_loss: 0.0831 391/500 [======================>.......] - ETA: 36s - loss: 0.8617 - regression_loss: 0.7785 - classification_loss: 0.0833 392/500 [======================>.......] - ETA: 36s - loss: 0.8614 - regression_loss: 0.7782 - classification_loss: 0.0832 393/500 [======================>.......] - ETA: 36s - loss: 0.8615 - regression_loss: 0.7783 - classification_loss: 0.0832 394/500 [======================>.......] - ETA: 35s - loss: 0.8612 - regression_loss: 0.7779 - classification_loss: 0.0833 395/500 [======================>.......] - ETA: 35s - loss: 0.8603 - regression_loss: 0.7772 - classification_loss: 0.0831 396/500 [======================>.......] - ETA: 35s - loss: 0.8612 - regression_loss: 0.7779 - classification_loss: 0.0833 397/500 [======================>.......] - ETA: 34s - loss: 0.8615 - regression_loss: 0.7781 - classification_loss: 0.0834 398/500 [======================>.......] - ETA: 34s - loss: 0.8618 - regression_loss: 0.7784 - classification_loss: 0.0834 399/500 [======================>.......] - ETA: 34s - loss: 0.8626 - regression_loss: 0.7790 - classification_loss: 0.0836 400/500 [=======================>......] - ETA: 33s - loss: 0.8622 - regression_loss: 0.7787 - classification_loss: 0.0835 401/500 [=======================>......] - ETA: 33s - loss: 0.8629 - regression_loss: 0.7793 - classification_loss: 0.0836 402/500 [=======================>......] - ETA: 33s - loss: 0.8626 - regression_loss: 0.7791 - classification_loss: 0.0835 403/500 [=======================>......] - ETA: 32s - loss: 0.8609 - regression_loss: 0.7775 - classification_loss: 0.0834 404/500 [=======================>......] 
- ETA: 32s - loss: 0.8612 - regression_loss: 0.7779 - classification_loss: 0.0833 405/500 [=======================>......] - ETA: 32s - loss: 0.8617 - regression_loss: 0.7784 - classification_loss: 0.0833 406/500 [=======================>......] - ETA: 31s - loss: 0.8624 - regression_loss: 0.7790 - classification_loss: 0.0834 407/500 [=======================>......] - ETA: 31s - loss: 0.8619 - regression_loss: 0.7786 - classification_loss: 0.0833 408/500 [=======================>......] - ETA: 31s - loss: 0.8628 - regression_loss: 0.7793 - classification_loss: 0.0835 409/500 [=======================>......] - ETA: 30s - loss: 0.8627 - regression_loss: 0.7793 - classification_loss: 0.0834 410/500 [=======================>......] - ETA: 30s - loss: 0.8629 - regression_loss: 0.7794 - classification_loss: 0.0835 411/500 [=======================>......] - ETA: 30s - loss: 0.8631 - regression_loss: 0.7796 - classification_loss: 0.0835 412/500 [=======================>......] - ETA: 29s - loss: 0.8625 - regression_loss: 0.7791 - classification_loss: 0.0834 413/500 [=======================>......] - ETA: 29s - loss: 0.8622 - regression_loss: 0.7789 - classification_loss: 0.0833 414/500 [=======================>......] - ETA: 29s - loss: 0.8621 - regression_loss: 0.7787 - classification_loss: 0.0834 415/500 [=======================>......] - ETA: 28s - loss: 0.8635 - regression_loss: 0.7798 - classification_loss: 0.0837 416/500 [=======================>......] - ETA: 28s - loss: 0.8641 - regression_loss: 0.7805 - classification_loss: 0.0837 417/500 [========================>.....] - ETA: 28s - loss: 0.8639 - regression_loss: 0.7802 - classification_loss: 0.0837 418/500 [========================>.....] - ETA: 27s - loss: 0.8633 - regression_loss: 0.7796 - classification_loss: 0.0837 419/500 [========================>.....] - ETA: 27s - loss: 0.8637 - regression_loss: 0.7800 - classification_loss: 0.0837 420/500 [========================>.....] 
- ETA: 27s - loss: 0.8644 - regression_loss: 0.7805 - classification_loss: 0.0839 421/500 [========================>.....] - ETA: 26s - loss: 0.8645 - regression_loss: 0.7805 - classification_loss: 0.0840 422/500 [========================>.....] - ETA: 26s - loss: 0.8651 - regression_loss: 0.7810 - classification_loss: 0.0840 423/500 [========================>.....] - ETA: 26s - loss: 0.8646 - regression_loss: 0.7806 - classification_loss: 0.0839 424/500 [========================>.....] - ETA: 25s - loss: 0.8651 - regression_loss: 0.7811 - classification_loss: 0.0840 425/500 [========================>.....] - ETA: 25s - loss: 0.8661 - regression_loss: 0.7819 - classification_loss: 0.0842 426/500 [========================>.....] - ETA: 25s - loss: 0.8668 - regression_loss: 0.7825 - classification_loss: 0.0843 427/500 [========================>.....] - ETA: 24s - loss: 0.8670 - regression_loss: 0.7827 - classification_loss: 0.0843 428/500 [========================>.....] - ETA: 24s - loss: 0.8668 - regression_loss: 0.7826 - classification_loss: 0.0842 429/500 [========================>.....] - ETA: 24s - loss: 0.8654 - regression_loss: 0.7814 - classification_loss: 0.0841 430/500 [========================>.....] - ETA: 23s - loss: 0.8654 - regression_loss: 0.7814 - classification_loss: 0.0840 431/500 [========================>.....] - ETA: 23s - loss: 0.8653 - regression_loss: 0.7814 - classification_loss: 0.0840 432/500 [========================>.....] - ETA: 23s - loss: 0.8659 - regression_loss: 0.7819 - classification_loss: 0.0841 433/500 [========================>.....] - ETA: 22s - loss: 0.8656 - regression_loss: 0.7816 - classification_loss: 0.0840 434/500 [=========================>....] - ETA: 22s - loss: 0.8653 - regression_loss: 0.7813 - classification_loss: 0.0840 435/500 [=========================>....] - ETA: 22s - loss: 0.8656 - regression_loss: 0.7817 - classification_loss: 0.0839 436/500 [=========================>....] 
- ETA: 21s - loss: 0.8658 - regression_loss: 0.7818 - classification_loss: 0.0840 437/500 [=========================>....] - ETA: 21s - loss: 0.8664 - regression_loss: 0.7825 - classification_loss: 0.0839 438/500 [=========================>....] - ETA: 21s - loss: 0.8659 - regression_loss: 0.7821 - classification_loss: 0.0838 439/500 [=========================>....] - ETA: 20s - loss: 0.8652 - regression_loss: 0.7815 - classification_loss: 0.0837 440/500 [=========================>....] - ETA: 20s - loss: 0.8642 - regression_loss: 0.7807 - classification_loss: 0.0836 441/500 [=========================>....] - ETA: 20s - loss: 0.8647 - regression_loss: 0.7811 - classification_loss: 0.0836 442/500 [=========================>....] - ETA: 19s - loss: 0.8636 - regression_loss: 0.7802 - classification_loss: 0.0835 443/500 [=========================>....] - ETA: 19s - loss: 0.8623 - regression_loss: 0.7789 - classification_loss: 0.0833 444/500 [=========================>....] - ETA: 18s - loss: 0.8622 - regression_loss: 0.7789 - classification_loss: 0.0833 445/500 [=========================>....] - ETA: 18s - loss: 0.8622 - regression_loss: 0.7789 - classification_loss: 0.0833 446/500 [=========================>....] - ETA: 18s - loss: 0.8619 - regression_loss: 0.7787 - classification_loss: 0.0832 447/500 [=========================>....] - ETA: 17s - loss: 0.8627 - regression_loss: 0.7792 - classification_loss: 0.0835 448/500 [=========================>....] - ETA: 17s - loss: 0.8636 - regression_loss: 0.7802 - classification_loss: 0.0834 449/500 [=========================>....] - ETA: 17s - loss: 0.8636 - regression_loss: 0.7802 - classification_loss: 0.0834 450/500 [==========================>...] - ETA: 16s - loss: 0.8642 - regression_loss: 0.7806 - classification_loss: 0.0836 451/500 [==========================>...] - ETA: 16s - loss: 0.8633 - regression_loss: 0.7799 - classification_loss: 0.0834 452/500 [==========================>...] 
- ETA: 16s - loss: 0.8636 - regression_loss: 0.7801 - classification_loss: 0.0835 453/500 [==========================>...] - ETA: 15s - loss: 0.8642 - regression_loss: 0.7807 - classification_loss: 0.0836 454/500 [==========================>...] - ETA: 15s - loss: 0.8646 - regression_loss: 0.7810 - classification_loss: 0.0836 455/500 [==========================>...] - ETA: 15s - loss: 0.8640 - regression_loss: 0.7805 - classification_loss: 0.0835 456/500 [==========================>...] - ETA: 14s - loss: 0.8640 - regression_loss: 0.7806 - classification_loss: 0.0834 457/500 [==========================>...] - ETA: 14s - loss: 0.8631 - regression_loss: 0.7798 - classification_loss: 0.0833 458/500 [==========================>...] - ETA: 14s - loss: 0.8638 - regression_loss: 0.7804 - classification_loss: 0.0834 459/500 [==========================>...] - ETA: 13s - loss: 0.8641 - regression_loss: 0.7807 - classification_loss: 0.0834 460/500 [==========================>...] - ETA: 13s - loss: 0.8645 - regression_loss: 0.7810 - classification_loss: 0.0835 461/500 [==========================>...] - ETA: 13s - loss: 0.8637 - regression_loss: 0.7804 - classification_loss: 0.0834 462/500 [==========================>...] - ETA: 12s - loss: 0.8627 - regression_loss: 0.7795 - classification_loss: 0.0832 463/500 [==========================>...] - ETA: 12s - loss: 0.8628 - regression_loss: 0.7796 - classification_loss: 0.0832 464/500 [==========================>...] - ETA: 12s - loss: 0.8619 - regression_loss: 0.7788 - classification_loss: 0.0831 465/500 [==========================>...] - ETA: 11s - loss: 0.8607 - regression_loss: 0.7778 - classification_loss: 0.0830 466/500 [==========================>...] - ETA: 11s - loss: 0.8604 - regression_loss: 0.7776 - classification_loss: 0.0829 467/500 [===========================>..] - ETA: 11s - loss: 0.8606 - regression_loss: 0.7776 - classification_loss: 0.0831 468/500 [===========================>..] 
- ETA: 10s - loss: 0.8600 - regression_loss: 0.7771 - classification_loss: 0.0830 469/500 [===========================>..] - ETA: 10s - loss: 0.8595 - regression_loss: 0.7766 - classification_loss: 0.0830 470/500 [===========================>..] - ETA: 10s - loss: 0.8584 - regression_loss: 0.7755 - classification_loss: 0.0828 471/500 [===========================>..] - ETA: 9s - loss: 0.8573 - regression_loss: 0.7746 - classification_loss: 0.0827  472/500 [===========================>..] - ETA: 9s - loss: 0.8582 - regression_loss: 0.7754 - classification_loss: 0.0829 473/500 [===========================>..] - ETA: 9s - loss: 0.8582 - regression_loss: 0.7754 - classification_loss: 0.0828 474/500 [===========================>..] - ETA: 8s - loss: 0.8588 - regression_loss: 0.7759 - classification_loss: 0.0829 475/500 [===========================>..] - ETA: 8s - loss: 0.8590 - regression_loss: 0.7762 - classification_loss: 0.0828 476/500 [===========================>..] - ETA: 8s - loss: 0.8592 - regression_loss: 0.7764 - classification_loss: 0.0827 477/500 [===========================>..] - ETA: 7s - loss: 0.8581 - regression_loss: 0.7755 - classification_loss: 0.0826 478/500 [===========================>..] - ETA: 7s - loss: 0.8580 - regression_loss: 0.7753 - classification_loss: 0.0826 479/500 [===========================>..] - ETA: 7s - loss: 0.8581 - regression_loss: 0.7755 - classification_loss: 0.0826 480/500 [===========================>..] - ETA: 6s - loss: 0.8590 - regression_loss: 0.7762 - classification_loss: 0.0827 481/500 [===========================>..] - ETA: 6s - loss: 0.8593 - regression_loss: 0.7766 - classification_loss: 0.0827 482/500 [===========================>..] - ETA: 6s - loss: 0.8595 - regression_loss: 0.7768 - classification_loss: 0.0827 483/500 [===========================>..] - ETA: 5s - loss: 0.8597 - regression_loss: 0.7770 - classification_loss: 0.0827 484/500 [============================>.] 
- ETA: 5s - loss: 0.8598 - regression_loss: 0.7770 - classification_loss: 0.0828 485/500 [============================>.] - ETA: 5s - loss: 0.8598 - regression_loss: 0.7770 - classification_loss: 0.0828 486/500 [============================>.] - ETA: 4s - loss: 0.8592 - regression_loss: 0.7765 - classification_loss: 0.0827 487/500 [============================>.] - ETA: 4s - loss: 0.8591 - regression_loss: 0.7764 - classification_loss: 0.0827 488/500 [============================>.] - ETA: 4s - loss: 0.8583 - regression_loss: 0.7758 - classification_loss: 0.0825 489/500 [============================>.] - ETA: 3s - loss: 0.8573 - regression_loss: 0.7749 - classification_loss: 0.0824 490/500 [============================>.] - ETA: 3s - loss: 0.8573 - regression_loss: 0.7749 - classification_loss: 0.0824 491/500 [============================>.] - ETA: 3s - loss: 0.8568 - regression_loss: 0.7745 - classification_loss: 0.0823 492/500 [============================>.] - ETA: 2s - loss: 0.8559 - regression_loss: 0.7737 - classification_loss: 0.0822 493/500 [============================>.] - ETA: 2s - loss: 0.8561 - regression_loss: 0.7739 - classification_loss: 0.0823 494/500 [============================>.] - ETA: 2s - loss: 0.8563 - regression_loss: 0.7740 - classification_loss: 0.0823 495/500 [============================>.] - ETA: 1s - loss: 0.8572 - regression_loss: 0.7748 - classification_loss: 0.0824 496/500 [============================>.] - ETA: 1s - loss: 0.8563 - regression_loss: 0.7740 - classification_loss: 0.0823 497/500 [============================>.] - ETA: 1s - loss: 0.8562 - regression_loss: 0.7739 - classification_loss: 0.0823 498/500 [============================>.] - ETA: 0s - loss: 0.8559 - regression_loss: 0.7737 - classification_loss: 0.0822 499/500 [============================>.] 
- ETA: 0s - loss: 0.8575 - regression_loss: 0.7752 - classification_loss: 0.0823
500/500 [==============================] - 170s 339ms/step - loss: 0.8574 - regression_loss: 0.7752 - classification_loss: 0.0822
1172 instances of class plum with average precision: 0.8057
mAP: 0.8057
Epoch 00042: saving model to ./training/snapshots/resnet101_pascal_42.h5
Epoch 43/150
1/500 [..............................] - ETA: 2:38 - loss: 0.6326 - regression_loss: 0.6099 - classification_loss: 0.0227
[... per-batch progress for steps 2-13 of epoch 43 trimmed ...]
13/500 [..............................] - ETA: 2:44 - loss: 0.8693 - regression_loss: 0.7898 - classification_loss: 0.0795 14/500 [..............................]
- ETA: 2:45 - loss: 0.8822 - regression_loss: 0.7990 - classification_loss: 0.0832
[... per-batch progress for steps 15-157 of epoch 43 trimmed; loss moved in the 0.82-0.89 range ...]
157/500 [========>.....................] - ETA: 1:57 - loss: 0.8657 - regression_loss: 0.7833 - classification_loss: 0.0824 158/500 [========>.....................]
- ETA: 1:56 - loss: 0.8665 - regression_loss: 0.7837 - classification_loss: 0.0827 159/500 [========>.....................] - ETA: 1:56 - loss: 0.8657 - regression_loss: 0.7827 - classification_loss: 0.0830 160/500 [========>.....................] - ETA: 1:56 - loss: 0.8682 - regression_loss: 0.7848 - classification_loss: 0.0834 161/500 [========>.....................] - ETA: 1:55 - loss: 0.8676 - regression_loss: 0.7843 - classification_loss: 0.0834 162/500 [========>.....................] - ETA: 1:55 - loss: 0.8651 - regression_loss: 0.7821 - classification_loss: 0.0830 163/500 [========>.....................] - ETA: 1:54 - loss: 0.8656 - regression_loss: 0.7826 - classification_loss: 0.0830 164/500 [========>.....................] - ETA: 1:54 - loss: 0.8653 - regression_loss: 0.7825 - classification_loss: 0.0829 165/500 [========>.....................] - ETA: 1:54 - loss: 0.8619 - regression_loss: 0.7794 - classification_loss: 0.0825 166/500 [========>.....................] - ETA: 1:53 - loss: 0.8584 - regression_loss: 0.7762 - classification_loss: 0.0822 167/500 [=========>....................] - ETA: 1:53 - loss: 0.8572 - regression_loss: 0.7752 - classification_loss: 0.0819 168/500 [=========>....................] - ETA: 1:53 - loss: 0.8571 - regression_loss: 0.7751 - classification_loss: 0.0820 169/500 [=========>....................] - ETA: 1:52 - loss: 0.8563 - regression_loss: 0.7744 - classification_loss: 0.0819 170/500 [=========>....................] - ETA: 1:52 - loss: 0.8583 - regression_loss: 0.7759 - classification_loss: 0.0823 171/500 [=========>....................] - ETA: 1:52 - loss: 0.8587 - regression_loss: 0.7763 - classification_loss: 0.0824 172/500 [=========>....................] - ETA: 1:51 - loss: 0.8604 - regression_loss: 0.7777 - classification_loss: 0.0827 173/500 [=========>....................] - ETA: 1:51 - loss: 0.8605 - regression_loss: 0.7779 - classification_loss: 0.0826 174/500 [=========>....................] 
- ETA: 1:51 - loss: 0.8623 - regression_loss: 0.7795 - classification_loss: 0.0829 175/500 [=========>....................] - ETA: 1:50 - loss: 0.8587 - regression_loss: 0.7762 - classification_loss: 0.0825 176/500 [=========>....................] - ETA: 1:50 - loss: 0.8603 - regression_loss: 0.7778 - classification_loss: 0.0826 177/500 [=========>....................] - ETA: 1:50 - loss: 0.8588 - regression_loss: 0.7766 - classification_loss: 0.0822 178/500 [=========>....................] - ETA: 1:49 - loss: 0.8582 - regression_loss: 0.7762 - classification_loss: 0.0820 179/500 [=========>....................] - ETA: 1:49 - loss: 0.8602 - regression_loss: 0.7778 - classification_loss: 0.0824 180/500 [=========>....................] - ETA: 1:49 - loss: 0.8580 - regression_loss: 0.7759 - classification_loss: 0.0821 181/500 [=========>....................] - ETA: 1:48 - loss: 0.8567 - regression_loss: 0.7748 - classification_loss: 0.0819 182/500 [=========>....................] - ETA: 1:48 - loss: 0.8583 - regression_loss: 0.7763 - classification_loss: 0.0820 183/500 [=========>....................] - ETA: 1:48 - loss: 0.8550 - regression_loss: 0.7734 - classification_loss: 0.0816 184/500 [==========>...................] - ETA: 1:47 - loss: 0.8562 - regression_loss: 0.7744 - classification_loss: 0.0818 185/500 [==========>...................] - ETA: 1:47 - loss: 0.8563 - regression_loss: 0.7745 - classification_loss: 0.0817 186/500 [==========>...................] - ETA: 1:47 - loss: 0.8569 - regression_loss: 0.7753 - classification_loss: 0.0816 187/500 [==========>...................] - ETA: 1:46 - loss: 0.8548 - regression_loss: 0.7735 - classification_loss: 0.0813 188/500 [==========>...................] - ETA: 1:46 - loss: 0.8546 - regression_loss: 0.7733 - classification_loss: 0.0813 189/500 [==========>...................] - ETA: 1:46 - loss: 0.8540 - regression_loss: 0.7728 - classification_loss: 0.0812 190/500 [==========>...................] 
- ETA: 1:45 - loss: 0.8540 - regression_loss: 0.7731 - classification_loss: 0.0810 191/500 [==========>...................] - ETA: 1:45 - loss: 0.8539 - regression_loss: 0.7731 - classification_loss: 0.0809 192/500 [==========>...................] - ETA: 1:45 - loss: 0.8516 - regression_loss: 0.7710 - classification_loss: 0.0805 193/500 [==========>...................] - ETA: 1:44 - loss: 0.8505 - regression_loss: 0.7697 - classification_loss: 0.0808 194/500 [==========>...................] - ETA: 1:44 - loss: 0.8493 - regression_loss: 0.7685 - classification_loss: 0.0808 195/500 [==========>...................] - ETA: 1:44 - loss: 0.8469 - regression_loss: 0.7664 - classification_loss: 0.0805 196/500 [==========>...................] - ETA: 1:43 - loss: 0.8469 - regression_loss: 0.7665 - classification_loss: 0.0804 197/500 [==========>...................] - ETA: 1:43 - loss: 0.8461 - regression_loss: 0.7658 - classification_loss: 0.0803 198/500 [==========>...................] - ETA: 1:42 - loss: 0.8466 - regression_loss: 0.7662 - classification_loss: 0.0804 199/500 [==========>...................] - ETA: 1:42 - loss: 0.8470 - regression_loss: 0.7666 - classification_loss: 0.0804 200/500 [===========>..................] - ETA: 1:42 - loss: 0.8474 - regression_loss: 0.7668 - classification_loss: 0.0806 201/500 [===========>..................] - ETA: 1:41 - loss: 0.8472 - regression_loss: 0.7666 - classification_loss: 0.0806 202/500 [===========>..................] - ETA: 1:41 - loss: 0.8495 - regression_loss: 0.7687 - classification_loss: 0.0808 203/500 [===========>..................] - ETA: 1:41 - loss: 0.8478 - regression_loss: 0.7672 - classification_loss: 0.0807 204/500 [===========>..................] - ETA: 1:40 - loss: 0.8488 - regression_loss: 0.7682 - classification_loss: 0.0806 205/500 [===========>..................] - ETA: 1:40 - loss: 0.8504 - regression_loss: 0.7696 - classification_loss: 0.0809 206/500 [===========>..................] 
- ETA: 1:40 - loss: 0.8497 - regression_loss: 0.7690 - classification_loss: 0.0806 207/500 [===========>..................] - ETA: 1:39 - loss: 0.8504 - regression_loss: 0.7696 - classification_loss: 0.0807 208/500 [===========>..................] - ETA: 1:39 - loss: 0.8517 - regression_loss: 0.7697 - classification_loss: 0.0819 209/500 [===========>..................] - ETA: 1:39 - loss: 0.8529 - regression_loss: 0.7709 - classification_loss: 0.0821 210/500 [===========>..................] - ETA: 1:38 - loss: 0.8539 - regression_loss: 0.7718 - classification_loss: 0.0821 211/500 [===========>..................] - ETA: 1:38 - loss: 0.8534 - regression_loss: 0.7713 - classification_loss: 0.0822 212/500 [===========>..................] - ETA: 1:38 - loss: 0.8533 - regression_loss: 0.7712 - classification_loss: 0.0821 213/500 [===========>..................] - ETA: 1:37 - loss: 0.8522 - regression_loss: 0.7702 - classification_loss: 0.0821 214/500 [===========>..................] - ETA: 1:37 - loss: 0.8518 - regression_loss: 0.7699 - classification_loss: 0.0819 215/500 [===========>..................] - ETA: 1:37 - loss: 0.8513 - regression_loss: 0.7695 - classification_loss: 0.0818 216/500 [===========>..................] - ETA: 1:36 - loss: 0.8520 - regression_loss: 0.7702 - classification_loss: 0.0818 217/500 [============>.................] - ETA: 1:36 - loss: 0.8500 - regression_loss: 0.7685 - classification_loss: 0.0816 218/500 [============>.................] - ETA: 1:36 - loss: 0.8494 - regression_loss: 0.7680 - classification_loss: 0.0815 219/500 [============>.................] - ETA: 1:35 - loss: 0.8500 - regression_loss: 0.7686 - classification_loss: 0.0814 220/500 [============>.................] - ETA: 1:35 - loss: 0.8510 - regression_loss: 0.7695 - classification_loss: 0.0816 221/500 [============>.................] - ETA: 1:35 - loss: 0.8484 - regression_loss: 0.7672 - classification_loss: 0.0812 222/500 [============>.................] 
- ETA: 1:34 - loss: 0.8508 - regression_loss: 0.7693 - classification_loss: 0.0815 223/500 [============>.................] - ETA: 1:34 - loss: 0.8484 - regression_loss: 0.7673 - classification_loss: 0.0812 224/500 [============>.................] - ETA: 1:34 - loss: 0.8527 - regression_loss: 0.7707 - classification_loss: 0.0820 225/500 [============>.................] - ETA: 1:33 - loss: 0.8545 - regression_loss: 0.7722 - classification_loss: 0.0823 226/500 [============>.................] - ETA: 1:33 - loss: 0.8548 - regression_loss: 0.7727 - classification_loss: 0.0821 227/500 [============>.................] - ETA: 1:33 - loss: 0.8548 - regression_loss: 0.7728 - classification_loss: 0.0820 228/500 [============>.................] - ETA: 1:32 - loss: 0.8548 - regression_loss: 0.7728 - classification_loss: 0.0819 229/500 [============>.................] - ETA: 1:32 - loss: 0.8532 - regression_loss: 0.7715 - classification_loss: 0.0817 230/500 [============>.................] - ETA: 1:32 - loss: 0.8515 - regression_loss: 0.7700 - classification_loss: 0.0815 231/500 [============>.................] - ETA: 1:31 - loss: 0.8515 - regression_loss: 0.7701 - classification_loss: 0.0815 232/500 [============>.................] - ETA: 1:31 - loss: 0.8522 - regression_loss: 0.7707 - classification_loss: 0.0816 233/500 [============>.................] - ETA: 1:31 - loss: 0.8512 - regression_loss: 0.7698 - classification_loss: 0.0814 234/500 [=============>................] - ETA: 1:30 - loss: 0.8514 - regression_loss: 0.7698 - classification_loss: 0.0816 235/500 [=============>................] - ETA: 1:30 - loss: 0.8523 - regression_loss: 0.7706 - classification_loss: 0.0817 236/500 [=============>................] - ETA: 1:30 - loss: 0.8515 - regression_loss: 0.7699 - classification_loss: 0.0816 237/500 [=============>................] - ETA: 1:29 - loss: 0.8521 - regression_loss: 0.7706 - classification_loss: 0.0815 238/500 [=============>................] 
- ETA: 1:29 - loss: 0.8548 - regression_loss: 0.7729 - classification_loss: 0.0819 239/500 [=============>................] - ETA: 1:29 - loss: 0.8567 - regression_loss: 0.7746 - classification_loss: 0.0821 240/500 [=============>................] - ETA: 1:28 - loss: 0.8554 - regression_loss: 0.7735 - classification_loss: 0.0819 241/500 [=============>................] - ETA: 1:28 - loss: 0.8562 - regression_loss: 0.7742 - classification_loss: 0.0820 242/500 [=============>................] - ETA: 1:28 - loss: 0.8594 - regression_loss: 0.7764 - classification_loss: 0.0830 243/500 [=============>................] - ETA: 1:27 - loss: 0.8574 - regression_loss: 0.7745 - classification_loss: 0.0829 244/500 [=============>................] - ETA: 1:27 - loss: 0.8581 - regression_loss: 0.7751 - classification_loss: 0.0830 245/500 [=============>................] - ETA: 1:27 - loss: 0.8577 - regression_loss: 0.7747 - classification_loss: 0.0830 246/500 [=============>................] - ETA: 1:26 - loss: 0.8569 - regression_loss: 0.7740 - classification_loss: 0.0829 247/500 [=============>................] - ETA: 1:26 - loss: 0.8579 - regression_loss: 0.7747 - classification_loss: 0.0831 248/500 [=============>................] - ETA: 1:26 - loss: 0.8569 - regression_loss: 0.7740 - classification_loss: 0.0829 249/500 [=============>................] - ETA: 1:25 - loss: 0.8550 - regression_loss: 0.7723 - classification_loss: 0.0827 250/500 [==============>...............] - ETA: 1:25 - loss: 0.8549 - regression_loss: 0.7722 - classification_loss: 0.0827 251/500 [==============>...............] - ETA: 1:24 - loss: 0.8553 - regression_loss: 0.7724 - classification_loss: 0.0829 252/500 [==============>...............] - ETA: 1:24 - loss: 0.8537 - regression_loss: 0.7711 - classification_loss: 0.0827 253/500 [==============>...............] - ETA: 1:24 - loss: 0.8553 - regression_loss: 0.7724 - classification_loss: 0.0828 254/500 [==============>...............] 
- ETA: 1:23 - loss: 0.8539 - regression_loss: 0.7713 - classification_loss: 0.0826 255/500 [==============>...............] - ETA: 1:23 - loss: 0.8516 - regression_loss: 0.7693 - classification_loss: 0.0823 256/500 [==============>...............] - ETA: 1:23 - loss: 0.8515 - regression_loss: 0.7692 - classification_loss: 0.0823 257/500 [==============>...............] - ETA: 1:22 - loss: 0.8512 - regression_loss: 0.7690 - classification_loss: 0.0822 258/500 [==============>...............] - ETA: 1:22 - loss: 0.8511 - regression_loss: 0.7689 - classification_loss: 0.0822 259/500 [==============>...............] - ETA: 1:22 - loss: 0.8500 - regression_loss: 0.7680 - classification_loss: 0.0820 260/500 [==============>...............] - ETA: 1:21 - loss: 0.8511 - regression_loss: 0.7688 - classification_loss: 0.0823 261/500 [==============>...............] - ETA: 1:21 - loss: 0.8513 - regression_loss: 0.7689 - classification_loss: 0.0825 262/500 [==============>...............] - ETA: 1:21 - loss: 0.8501 - regression_loss: 0.7678 - classification_loss: 0.0823 263/500 [==============>...............] - ETA: 1:20 - loss: 0.8546 - regression_loss: 0.7693 - classification_loss: 0.0852 264/500 [==============>...............] - ETA: 1:20 - loss: 0.8527 - regression_loss: 0.7677 - classification_loss: 0.0850 265/500 [==============>...............] - ETA: 1:20 - loss: 0.8508 - regression_loss: 0.7661 - classification_loss: 0.0847 266/500 [==============>...............] - ETA: 1:19 - loss: 0.8498 - regression_loss: 0.7653 - classification_loss: 0.0845 267/500 [===============>..............] - ETA: 1:19 - loss: 0.8482 - regression_loss: 0.7640 - classification_loss: 0.0842 268/500 [===============>..............] - ETA: 1:19 - loss: 0.8489 - regression_loss: 0.7647 - classification_loss: 0.0842 269/500 [===============>..............] - ETA: 1:18 - loss: 0.8485 - regression_loss: 0.7643 - classification_loss: 0.0842 270/500 [===============>..............] 
- ETA: 1:18 - loss: 0.8472 - regression_loss: 0.7631 - classification_loss: 0.0841 271/500 [===============>..............] - ETA: 1:18 - loss: 0.8458 - regression_loss: 0.7619 - classification_loss: 0.0839 272/500 [===============>..............] - ETA: 1:17 - loss: 0.8454 - regression_loss: 0.7617 - classification_loss: 0.0838 273/500 [===============>..............] - ETA: 1:17 - loss: 0.8448 - regression_loss: 0.7612 - classification_loss: 0.0836 274/500 [===============>..............] - ETA: 1:17 - loss: 0.8444 - regression_loss: 0.7608 - classification_loss: 0.0836 275/500 [===============>..............] - ETA: 1:16 - loss: 0.8458 - regression_loss: 0.7617 - classification_loss: 0.0841 276/500 [===============>..............] - ETA: 1:16 - loss: 0.8451 - regression_loss: 0.7611 - classification_loss: 0.0839 277/500 [===============>..............] - ETA: 1:16 - loss: 0.8443 - regression_loss: 0.7604 - classification_loss: 0.0839 278/500 [===============>..............] - ETA: 1:15 - loss: 0.8444 - regression_loss: 0.7606 - classification_loss: 0.0838 279/500 [===============>..............] - ETA: 1:15 - loss: 0.8451 - regression_loss: 0.7611 - classification_loss: 0.0839 280/500 [===============>..............] - ETA: 1:15 - loss: 0.8455 - regression_loss: 0.7614 - classification_loss: 0.0841 281/500 [===============>..............] - ETA: 1:14 - loss: 0.8456 - regression_loss: 0.7614 - classification_loss: 0.0842 282/500 [===============>..............] - ETA: 1:14 - loss: 0.8480 - regression_loss: 0.7633 - classification_loss: 0.0848 283/500 [===============>..............] - ETA: 1:14 - loss: 0.8478 - regression_loss: 0.7623 - classification_loss: 0.0856 284/500 [================>.............] - ETA: 1:13 - loss: 0.8484 - regression_loss: 0.7627 - classification_loss: 0.0857 285/500 [================>.............] - ETA: 1:13 - loss: 0.8492 - regression_loss: 0.7634 - classification_loss: 0.0858 286/500 [================>.............] 
- ETA: 1:13 - loss: 0.8501 - regression_loss: 0.7642 - classification_loss: 0.0859 287/500 [================>.............] - ETA: 1:12 - loss: 0.8499 - regression_loss: 0.7640 - classification_loss: 0.0859 288/500 [================>.............] - ETA: 1:12 - loss: 0.8498 - regression_loss: 0.7640 - classification_loss: 0.0858 289/500 [================>.............] - ETA: 1:12 - loss: 0.8490 - regression_loss: 0.7634 - classification_loss: 0.0856 290/500 [================>.............] - ETA: 1:11 - loss: 0.8492 - regression_loss: 0.7636 - classification_loss: 0.0856 291/500 [================>.............] - ETA: 1:11 - loss: 0.8516 - regression_loss: 0.7655 - classification_loss: 0.0861 292/500 [================>.............] - ETA: 1:10 - loss: 0.8510 - regression_loss: 0.7650 - classification_loss: 0.0860 293/500 [================>.............] - ETA: 1:10 - loss: 0.8501 - regression_loss: 0.7642 - classification_loss: 0.0860 294/500 [================>.............] - ETA: 1:10 - loss: 0.8492 - regression_loss: 0.7634 - classification_loss: 0.0858 295/500 [================>.............] - ETA: 1:09 - loss: 0.8497 - regression_loss: 0.7638 - classification_loss: 0.0859 296/500 [================>.............] - ETA: 1:09 - loss: 0.8492 - regression_loss: 0.7634 - classification_loss: 0.0858 297/500 [================>.............] - ETA: 1:09 - loss: 0.8497 - regression_loss: 0.7639 - classification_loss: 0.0858 298/500 [================>.............] - ETA: 1:08 - loss: 0.8492 - regression_loss: 0.7635 - classification_loss: 0.0856 299/500 [================>.............] - ETA: 1:08 - loss: 0.8504 - regression_loss: 0.7646 - classification_loss: 0.0858 300/500 [=================>............] - ETA: 1:08 - loss: 0.8487 - regression_loss: 0.7631 - classification_loss: 0.0856 301/500 [=================>............] - ETA: 1:07 - loss: 0.8468 - regression_loss: 0.7615 - classification_loss: 0.0854 302/500 [=================>............] 
- ETA: 1:07 - loss: 0.8468 - regression_loss: 0.7613 - classification_loss: 0.0854 303/500 [=================>............] - ETA: 1:07 - loss: 0.8470 - regression_loss: 0.7614 - classification_loss: 0.0856 304/500 [=================>............] - ETA: 1:06 - loss: 0.8480 - regression_loss: 0.7622 - classification_loss: 0.0858 305/500 [=================>............] - ETA: 1:06 - loss: 0.8457 - regression_loss: 0.7602 - classification_loss: 0.0855 306/500 [=================>............] - ETA: 1:06 - loss: 0.8454 - regression_loss: 0.7600 - classification_loss: 0.0854 307/500 [=================>............] - ETA: 1:05 - loss: 0.8442 - regression_loss: 0.7590 - classification_loss: 0.0852 308/500 [=================>............] - ETA: 1:05 - loss: 0.8435 - regression_loss: 0.7584 - classification_loss: 0.0851 309/500 [=================>............] - ETA: 1:05 - loss: 0.8428 - regression_loss: 0.7578 - classification_loss: 0.0850 310/500 [=================>............] - ETA: 1:04 - loss: 0.8431 - regression_loss: 0.7581 - classification_loss: 0.0850 311/500 [=================>............] - ETA: 1:04 - loss: 0.8428 - regression_loss: 0.7578 - classification_loss: 0.0850 312/500 [=================>............] - ETA: 1:04 - loss: 0.8449 - regression_loss: 0.7596 - classification_loss: 0.0853 313/500 [=================>............] - ETA: 1:03 - loss: 0.8443 - regression_loss: 0.7591 - classification_loss: 0.0852 314/500 [=================>............] - ETA: 1:03 - loss: 0.8440 - regression_loss: 0.7587 - classification_loss: 0.0853 315/500 [=================>............] - ETA: 1:03 - loss: 0.8448 - regression_loss: 0.7595 - classification_loss: 0.0853 316/500 [=================>............] - ETA: 1:02 - loss: 0.8446 - regression_loss: 0.7595 - classification_loss: 0.0851 317/500 [==================>...........] - ETA: 1:02 - loss: 0.8439 - regression_loss: 0.7589 - classification_loss: 0.0850 318/500 [==================>...........] 
- ETA: 1:02 - loss: 0.8432 - regression_loss: 0.7584 - classification_loss: 0.0848 319/500 [==================>...........] - ETA: 1:01 - loss: 0.8449 - regression_loss: 0.7598 - classification_loss: 0.0851 320/500 [==================>...........] - ETA: 1:01 - loss: 0.8441 - regression_loss: 0.7591 - classification_loss: 0.0850 321/500 [==================>...........] - ETA: 1:00 - loss: 0.8445 - regression_loss: 0.7595 - classification_loss: 0.0850 322/500 [==================>...........] - ETA: 1:00 - loss: 0.8449 - regression_loss: 0.7600 - classification_loss: 0.0849 323/500 [==================>...........] - ETA: 1:00 - loss: 0.8453 - regression_loss: 0.7604 - classification_loss: 0.0849 324/500 [==================>...........] - ETA: 59s - loss: 0.8460 - regression_loss: 0.7610 - classification_loss: 0.0850  325/500 [==================>...........] - ETA: 59s - loss: 0.8448 - regression_loss: 0.7600 - classification_loss: 0.0848 326/500 [==================>...........] - ETA: 59s - loss: 0.8453 - regression_loss: 0.7603 - classification_loss: 0.0849 327/500 [==================>...........] - ETA: 58s - loss: 0.8445 - regression_loss: 0.7597 - classification_loss: 0.0848 328/500 [==================>...........] - ETA: 58s - loss: 0.8453 - regression_loss: 0.7603 - classification_loss: 0.0850 329/500 [==================>...........] - ETA: 58s - loss: 0.8447 - regression_loss: 0.7597 - classification_loss: 0.0850 330/500 [==================>...........] - ETA: 57s - loss: 0.8441 - regression_loss: 0.7593 - classification_loss: 0.0848 331/500 [==================>...........] - ETA: 57s - loss: 0.8430 - regression_loss: 0.7585 - classification_loss: 0.0846 332/500 [==================>...........] - ETA: 57s - loss: 0.8431 - regression_loss: 0.7586 - classification_loss: 0.0846 333/500 [==================>...........] - ETA: 56s - loss: 0.8422 - regression_loss: 0.7578 - classification_loss: 0.0845 334/500 [===================>..........] 
- ETA: 56s - loss: 0.8433 - regression_loss: 0.7585 - classification_loss: 0.0847 335/500 [===================>..........] - ETA: 56s - loss: 0.8419 - regression_loss: 0.7574 - classification_loss: 0.0845 336/500 [===================>..........] - ETA: 55s - loss: 0.8405 - regression_loss: 0.7562 - classification_loss: 0.0843 337/500 [===================>..........] - ETA: 55s - loss: 0.8399 - regression_loss: 0.7556 - classification_loss: 0.0843 338/500 [===================>..........] - ETA: 55s - loss: 0.8397 - regression_loss: 0.7555 - classification_loss: 0.0842 339/500 [===================>..........] - ETA: 54s - loss: 0.8401 - regression_loss: 0.7559 - classification_loss: 0.0842 340/500 [===================>..........] - ETA: 54s - loss: 0.8401 - regression_loss: 0.7560 - classification_loss: 0.0842 341/500 [===================>..........] - ETA: 54s - loss: 0.8398 - regression_loss: 0.7558 - classification_loss: 0.0841 342/500 [===================>..........] - ETA: 53s - loss: 0.8403 - regression_loss: 0.7561 - classification_loss: 0.0841 343/500 [===================>..........] - ETA: 53s - loss: 0.8402 - regression_loss: 0.7560 - classification_loss: 0.0841 344/500 [===================>..........] - ETA: 53s - loss: 0.8390 - regression_loss: 0.7550 - classification_loss: 0.0839 345/500 [===================>..........] - ETA: 52s - loss: 0.8392 - regression_loss: 0.7553 - classification_loss: 0.0840 346/500 [===================>..........] - ETA: 52s - loss: 0.8379 - regression_loss: 0.7541 - classification_loss: 0.0838 347/500 [===================>..........] - ETA: 52s - loss: 0.8368 - regression_loss: 0.7531 - classification_loss: 0.0837 348/500 [===================>..........] - ETA: 51s - loss: 0.8376 - regression_loss: 0.7539 - classification_loss: 0.0837 349/500 [===================>..........] - ETA: 51s - loss: 0.8363 - regression_loss: 0.7528 - classification_loss: 0.0835 350/500 [====================>.........] 
- ETA: 51s - loss: 0.8364 - regression_loss: 0.7529 - classification_loss: 0.0835 351/500 [====================>.........] - ETA: 50s - loss: 0.8356 - regression_loss: 0.7522 - classification_loss: 0.0834 352/500 [====================>.........] - ETA: 50s - loss: 0.8344 - regression_loss: 0.7512 - classification_loss: 0.0833 353/500 [====================>.........] - ETA: 50s - loss: 0.8345 - regression_loss: 0.7513 - classification_loss: 0.0832 354/500 [====================>.........] - ETA: 49s - loss: 0.8332 - regression_loss: 0.7501 - classification_loss: 0.0831 355/500 [====================>.........] - ETA: 49s - loss: 0.8332 - regression_loss: 0.7502 - classification_loss: 0.0830 356/500 [====================>.........] - ETA: 49s - loss: 0.8341 - regression_loss: 0.7510 - classification_loss: 0.0831 357/500 [====================>.........] - ETA: 48s - loss: 0.8344 - regression_loss: 0.7513 - classification_loss: 0.0831 358/500 [====================>.........] - ETA: 48s - loss: 0.8342 - regression_loss: 0.7511 - classification_loss: 0.0830 359/500 [====================>.........] - ETA: 48s - loss: 0.8352 - regression_loss: 0.7521 - classification_loss: 0.0831 360/500 [====================>.........] - ETA: 47s - loss: 0.8337 - regression_loss: 0.7508 - classification_loss: 0.0829 361/500 [====================>.........] - ETA: 47s - loss: 0.8330 - regression_loss: 0.7502 - classification_loss: 0.0828 362/500 [====================>.........] - ETA: 47s - loss: 0.8344 - regression_loss: 0.7513 - classification_loss: 0.0830 363/500 [====================>.........] - ETA: 46s - loss: 0.8340 - regression_loss: 0.7509 - classification_loss: 0.0830 364/500 [====================>.........] - ETA: 46s - loss: 0.8339 - regression_loss: 0.7509 - classification_loss: 0.0830 365/500 [====================>.........] - ETA: 46s - loss: 0.8340 - regression_loss: 0.7510 - classification_loss: 0.0830 366/500 [====================>.........] 
500/500 [==============================] - 170s 341ms/step - loss: 0.8396 - regression_loss: 0.7562 - classification_loss: 0.0834
1172 instances of class plum with average precision: 0.8028
mAP: 0.8028
Epoch 00043: saving model to ./training/snapshots/resnet101_pascal_43.h5