CHECK: Is CUDA the right version (10)? Using backbone resnet101 weights arg is None Loading imagenet weights Creating model, this may take a second... Loading weights into model tracking anchors tracking anchors tracking anchors tracking anchors tracking anchors Model: "retinanet" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) (None, None, None, 3 0 __________________________________________________________________________________________________ padding_conv1 (ZeroPadding2D) (None, None, None, 3 0 input_1[0][0] __________________________________________________________________________________________________ conv1 (Conv2D) (None, None, None, 6 9408 padding_conv1[0][0] __________________________________________________________________________________________________ bn_conv1 (BatchNormalization) (None, None, None, 6 256 conv1[0][0] __________________________________________________________________________________________________ conv1_relu (Activation) (None, None, None, 6 0 bn_conv1[0][0] __________________________________________________________________________________________________ pool1 (MaxPooling2D) (None, None, None, 6 0 conv1_relu[0][0] __________________________________________________________________________________________________ res2a_branch2a (Conv2D) (None, None, None, 6 4096 pool1[0][0] __________________________________________________________________________________________________ bn2a_branch2a (BatchNormalizati (None, None, None, 6 256 res2a_branch2a[0][0] __________________________________________________________________________________________________ res2a_branch2a_relu (Activation (None, None, None, 6 0 bn2a_branch2a[0][0] __________________________________________________________________________________________________ 
padding2a_branch2b (ZeroPadding (None, None, None, 6 0 res2a_branch2a_relu[0][0] __________________________________________________________________________________________________ res2a_branch2b (Conv2D) (None, None, None, 6 36864 padding2a_branch2b[0][0] __________________________________________________________________________________________________ bn2a_branch2b (BatchNormalizati (None, None, None, 6 256 res2a_branch2b[0][0] __________________________________________________________________________________________________ res2a_branch2b_relu (Activation (None, None, None, 6 0 bn2a_branch2b[0][0] __________________________________________________________________________________________________ res2a_branch2c (Conv2D) (None, None, None, 2 16384 res2a_branch2b_relu[0][0] __________________________________________________________________________________________________ res2a_branch1 (Conv2D) (None, None, None, 2 16384 pool1[0][0] __________________________________________________________________________________________________ bn2a_branch2c (BatchNormalizati (None, None, None, 2 1024 res2a_branch2c[0][0] __________________________________________________________________________________________________ bn2a_branch1 (BatchNormalizatio (None, None, None, 2 1024 res2a_branch1[0][0] __________________________________________________________________________________________________ res2a (Add) (None, None, None, 2 0 bn2a_branch2c[0][0] bn2a_branch1[0][0] __________________________________________________________________________________________________ res2a_relu (Activation) (None, None, None, 2 0 res2a[0][0] __________________________________________________________________________________________________ res2b_branch2a (Conv2D) (None, None, None, 6 16384 res2a_relu[0][0] __________________________________________________________________________________________________ bn2b_branch2a (BatchNormalizati (None, None, None, 6 256 res2b_branch2a[0][0] 
__________________________________________________________________________________________________ res2b_branch2a_relu (Activation (None, None, None, 6 0 bn2b_branch2a[0][0] __________________________________________________________________________________________________ padding2b_branch2b (ZeroPadding (None, None, None, 6 0 res2b_branch2a_relu[0][0] __________________________________________________________________________________________________ res2b_branch2b (Conv2D) (None, None, None, 6 36864 padding2b_branch2b[0][0] __________________________________________________________________________________________________ bn2b_branch2b (BatchNormalizati (None, None, None, 6 256 res2b_branch2b[0][0] __________________________________________________________________________________________________ res2b_branch2b_relu (Activation (None, None, None, 6 0 bn2b_branch2b[0][0] __________________________________________________________________________________________________ res2b_branch2c (Conv2D) (None, None, None, 2 16384 res2b_branch2b_relu[0][0] __________________________________________________________________________________________________ bn2b_branch2c (BatchNormalizati (None, None, None, 2 1024 res2b_branch2c[0][0] __________________________________________________________________________________________________ res2b (Add) (None, None, None, 2 0 bn2b_branch2c[0][0] res2a_relu[0][0] __________________________________________________________________________________________________ res2b_relu (Activation) (None, None, None, 2 0 res2b[0][0] __________________________________________________________________________________________________ res2c_branch2a (Conv2D) (None, None, None, 6 16384 res2b_relu[0][0] __________________________________________________________________________________________________ bn2c_branch2a (BatchNormalizati (None, None, None, 6 256 res2c_branch2a[0][0] 
__________________________________________________________________________________________________ res2c_branch2a_relu (Activation (None, None, None, 6 0 bn2c_branch2a[0][0] __________________________________________________________________________________________________ padding2c_branch2b (ZeroPadding (None, None, None, 6 0 res2c_branch2a_relu[0][0] __________________________________________________________________________________________________ res2c_branch2b (Conv2D) (None, None, None, 6 36864 padding2c_branch2b[0][0] __________________________________________________________________________________________________ bn2c_branch2b (BatchNormalizati (None, None, None, 6 256 res2c_branch2b[0][0] __________________________________________________________________________________________________ res2c_branch2b_relu (Activation (None, None, None, 6 0 bn2c_branch2b[0][0] __________________________________________________________________________________________________ res2c_branch2c (Conv2D) (None, None, None, 2 16384 res2c_branch2b_relu[0][0] __________________________________________________________________________________________________ bn2c_branch2c (BatchNormalizati (None, None, None, 2 1024 res2c_branch2c[0][0] __________________________________________________________________________________________________ res2c (Add) (None, None, None, 2 0 bn2c_branch2c[0][0] res2b_relu[0][0] __________________________________________________________________________________________________ res2c_relu (Activation) (None, None, None, 2 0 res2c[0][0] __________________________________________________________________________________________________ res3a_branch2a (Conv2D) (None, None, None, 1 32768 res2c_relu[0][0] __________________________________________________________________________________________________ bn3a_branch2a (BatchNormalizati (None, None, None, 1 512 res3a_branch2a[0][0] 
__________________________________________________________________________________________________ res3a_branch2a_relu (Activation (None, None, None, 1 0 bn3a_branch2a[0][0] __________________________________________________________________________________________________ padding3a_branch2b (ZeroPadding (None, None, None, 1 0 res3a_branch2a_relu[0][0] __________________________________________________________________________________________________ res3a_branch2b (Conv2D) (None, None, None, 1 147456 padding3a_branch2b[0][0] __________________________________________________________________________________________________ bn3a_branch2b (BatchNormalizati (None, None, None, 1 512 res3a_branch2b[0][0] __________________________________________________________________________________________________ res3a_branch2b_relu (Activation (None, None, None, 1 0 bn3a_branch2b[0][0] __________________________________________________________________________________________________ res3a_branch2c (Conv2D) (None, None, None, 5 65536 res3a_branch2b_relu[0][0] __________________________________________________________________________________________________ res3a_branch1 (Conv2D) (None, None, None, 5 131072 res2c_relu[0][0] __________________________________________________________________________________________________ bn3a_branch2c (BatchNormalizati (None, None, None, 5 2048 res3a_branch2c[0][0] __________________________________________________________________________________________________ bn3a_branch1 (BatchNormalizatio (None, None, None, 5 2048 res3a_branch1[0][0] __________________________________________________________________________________________________ res3a (Add) (None, None, None, 5 0 bn3a_branch2c[0][0] bn3a_branch1[0][0] __________________________________________________________________________________________________ res3a_relu (Activation) (None, None, None, 5 0 res3a[0][0] 
__________________________________________________________________________________________________ res3b1_branch2a (Conv2D) (None, None, None, 1 65536 res3a_relu[0][0] __________________________________________________________________________________________________ bn3b1_branch2a (BatchNormalizat (None, None, None, 1 512 res3b1_branch2a[0][0] __________________________________________________________________________________________________ res3b1_branch2a_relu (Activatio (None, None, None, 1 0 bn3b1_branch2a[0][0] __________________________________________________________________________________________________ padding3b1_branch2b (ZeroPaddin (None, None, None, 1 0 res3b1_branch2a_relu[0][0] __________________________________________________________________________________________________ res3b1_branch2b (Conv2D) (None, None, None, 1 147456 padding3b1_branch2b[0][0] __________________________________________________________________________________________________ bn3b1_branch2b (BatchNormalizat (None, None, None, 1 512 res3b1_branch2b[0][0] __________________________________________________________________________________________________ res3b1_branch2b_relu (Activatio (None, None, None, 1 0 bn3b1_branch2b[0][0] __________________________________________________________________________________________________ res3b1_branch2c (Conv2D) (None, None, None, 5 65536 res3b1_branch2b_relu[0][0] __________________________________________________________________________________________________ bn3b1_branch2c (BatchNormalizat (None, None, None, 5 2048 res3b1_branch2c[0][0] __________________________________________________________________________________________________ res3b1 (Add) (None, None, None, 5 0 bn3b1_branch2c[0][0] res3a_relu[0][0] __________________________________________________________________________________________________ res3b1_relu (Activation) (None, None, None, 5 0 res3b1[0][0] 
__________________________________________________________________________________________________ res3b2_branch2a (Conv2D) (None, None, None, 1 65536 res3b1_relu[0][0] __________________________________________________________________________________________________ bn3b2_branch2a (BatchNormalizat (None, None, None, 1 512 res3b2_branch2a[0][0] __________________________________________________________________________________________________ res3b2_branch2a_relu (Activatio (None, None, None, 1 0 bn3b2_branch2a[0][0] __________________________________________________________________________________________________ padding3b2_branch2b (ZeroPaddin (None, None, None, 1 0 res3b2_branch2a_relu[0][0] __________________________________________________________________________________________________ res3b2_branch2b (Conv2D) (None, None, None, 1 147456 padding3b2_branch2b[0][0] __________________________________________________________________________________________________ bn3b2_branch2b (BatchNormalizat (None, None, None, 1 512 res3b2_branch2b[0][0] __________________________________________________________________________________________________ res3b2_branch2b_relu (Activatio (None, None, None, 1 0 bn3b2_branch2b[0][0] __________________________________________________________________________________________________ res3b2_branch2c (Conv2D) (None, None, None, 5 65536 res3b2_branch2b_relu[0][0] __________________________________________________________________________________________________ bn3b2_branch2c (BatchNormalizat (None, None, None, 5 2048 res3b2_branch2c[0][0] __________________________________________________________________________________________________ res3b2 (Add) (None, None, None, 5 0 bn3b2_branch2c[0][0] res3b1_relu[0][0] __________________________________________________________________________________________________ res3b2_relu (Activation) (None, None, None, 5 0 res3b2[0][0] 
__________________________________________________________________________________________________ res3b3_branch2a (Conv2D) (None, None, None, 1 65536 res3b2_relu[0][0] __________________________________________________________________________________________________ bn3b3_branch2a (BatchNormalizat (None, None, None, 1 512 res3b3_branch2a[0][0] __________________________________________________________________________________________________ res3b3_branch2a_relu (Activatio (None, None, None, 1 0 bn3b3_branch2a[0][0] __________________________________________________________________________________________________ padding3b3_branch2b (ZeroPaddin (None, None, None, 1 0 res3b3_branch2a_relu[0][0] __________________________________________________________________________________________________ res3b3_branch2b (Conv2D) (None, None, None, 1 147456 padding3b3_branch2b[0][0] __________________________________________________________________________________________________ bn3b3_branch2b (BatchNormalizat (None, None, None, 1 512 res3b3_branch2b[0][0] __________________________________________________________________________________________________ res3b3_branch2b_relu (Activatio (None, None, None, 1 0 bn3b3_branch2b[0][0] __________________________________________________________________________________________________ res3b3_branch2c (Conv2D) (None, None, None, 5 65536 res3b3_branch2b_relu[0][0] __________________________________________________________________________________________________ bn3b3_branch2c (BatchNormalizat (None, None, None, 5 2048 res3b3_branch2c[0][0] __________________________________________________________________________________________________ res3b3 (Add) (None, None, None, 5 0 bn3b3_branch2c[0][0] res3b2_relu[0][0] __________________________________________________________________________________________________ res3b3_relu (Activation) (None, None, None, 5 0 res3b3[0][0] 
__________________________________________________________________________________________________ res4a_branch2a (Conv2D) (None, None, None, 2 131072 res3b3_relu[0][0] __________________________________________________________________________________________________ bn4a_branch2a (BatchNormalizati (None, None, None, 2 1024 res4a_branch2a[0][0] __________________________________________________________________________________________________ res4a_branch2a_relu (Activation (None, None, None, 2 0 bn4a_branch2a[0][0] __________________________________________________________________________________________________ padding4a_branch2b (ZeroPadding (None, None, None, 2 0 res4a_branch2a_relu[0][0] __________________________________________________________________________________________________ res4a_branch2b (Conv2D) (None, None, None, 2 589824 padding4a_branch2b[0][0] __________________________________________________________________________________________________ bn4a_branch2b (BatchNormalizati (None, None, None, 2 1024 res4a_branch2b[0][0] __________________________________________________________________________________________________ res4a_branch2b_relu (Activation (None, None, None, 2 0 bn4a_branch2b[0][0] __________________________________________________________________________________________________ res4a_branch2c (Conv2D) (None, None, None, 1 262144 res4a_branch2b_relu[0][0] __________________________________________________________________________________________________ res4a_branch1 (Conv2D) (None, None, None, 1 524288 res3b3_relu[0][0] __________________________________________________________________________________________________ bn4a_branch2c (BatchNormalizati (None, None, None, 1 4096 res4a_branch2c[0][0] __________________________________________________________________________________________________ bn4a_branch1 (BatchNormalizatio (None, None, None, 1 4096 res4a_branch1[0][0] 
__________________________________________________________________________________________________ res4a (Add) (None, None, None, 1 0 bn4a_branch2c[0][0] bn4a_branch1[0][0] __________________________________________________________________________________________________ res4a_relu (Activation) (None, None, None, 1 0 res4a[0][0] __________________________________________________________________________________________________ res4b1_branch2a (Conv2D) (None, None, None, 2 262144 res4a_relu[0][0] __________________________________________________________________________________________________ bn4b1_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b1_branch2a[0][0] __________________________________________________________________________________________________ res4b1_branch2a_relu (Activatio (None, None, None, 2 0 bn4b1_branch2a[0][0] __________________________________________________________________________________________________ padding4b1_branch2b (ZeroPaddin (None, None, None, 2 0 res4b1_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b1_branch2b (Conv2D) (None, None, None, 2 589824 padding4b1_branch2b[0][0] __________________________________________________________________________________________________ bn4b1_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b1_branch2b[0][0] __________________________________________________________________________________________________ res4b1_branch2b_relu (Activatio (None, None, None, 2 0 bn4b1_branch2b[0][0] __________________________________________________________________________________________________ res4b1_branch2c (Conv2D) (None, None, None, 1 262144 res4b1_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b1_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b1_branch2c[0][0] 
__________________________________________________________________________________________________ res4b1 (Add) (None, None, None, 1 0 bn4b1_branch2c[0][0] res4a_relu[0][0] __________________________________________________________________________________________________ res4b1_relu (Activation) (None, None, None, 1 0 res4b1[0][0] __________________________________________________________________________________________________ res4b2_branch2a (Conv2D) (None, None, None, 2 262144 res4b1_relu[0][0] __________________________________________________________________________________________________ bn4b2_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b2_branch2a[0][0] __________________________________________________________________________________________________ res4b2_branch2a_relu (Activatio (None, None, None, 2 0 bn4b2_branch2a[0][0] __________________________________________________________________________________________________ padding4b2_branch2b (ZeroPaddin (None, None, None, 2 0 res4b2_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b2_branch2b (Conv2D) (None, None, None, 2 589824 padding4b2_branch2b[0][0] __________________________________________________________________________________________________ bn4b2_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b2_branch2b[0][0] __________________________________________________________________________________________________ res4b2_branch2b_relu (Activatio (None, None, None, 2 0 bn4b2_branch2b[0][0] __________________________________________________________________________________________________ res4b2_branch2c (Conv2D) (None, None, None, 1 262144 res4b2_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b2_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b2_branch2c[0][0] 
__________________________________________________________________________________________________ res4b2 (Add) (None, None, None, 1 0 bn4b2_branch2c[0][0] res4b1_relu[0][0] __________________________________________________________________________________________________ res4b2_relu (Activation) (None, None, None, 1 0 res4b2[0][0] __________________________________________________________________________________________________ res4b3_branch2a (Conv2D) (None, None, None, 2 262144 res4b2_relu[0][0] __________________________________________________________________________________________________ bn4b3_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b3_branch2a[0][0] __________________________________________________________________________________________________ res4b3_branch2a_relu (Activatio (None, None, None, 2 0 bn4b3_branch2a[0][0] __________________________________________________________________________________________________ padding4b3_branch2b (ZeroPaddin (None, None, None, 2 0 res4b3_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b3_branch2b (Conv2D) (None, None, None, 2 589824 padding4b3_branch2b[0][0] __________________________________________________________________________________________________ bn4b3_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b3_branch2b[0][0] __________________________________________________________________________________________________ res4b3_branch2b_relu (Activatio (None, None, None, 2 0 bn4b3_branch2b[0][0] __________________________________________________________________________________________________ res4b3_branch2c (Conv2D) (None, None, None, 1 262144 res4b3_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b3_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b3_branch2c[0][0] 
__________________________________________________________________________________________________ res4b3 (Add) (None, None, None, 1 0 bn4b3_branch2c[0][0] res4b2_relu[0][0] __________________________________________________________________________________________________ res4b3_relu (Activation) (None, None, None, 1 0 res4b3[0][0] __________________________________________________________________________________________________ res4b4_branch2a (Conv2D) (None, None, None, 2 262144 res4b3_relu[0][0] __________________________________________________________________________________________________ bn4b4_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b4_branch2a[0][0] __________________________________________________________________________________________________ res4b4_branch2a_relu (Activatio (None, None, None, 2 0 bn4b4_branch2a[0][0] __________________________________________________________________________________________________ padding4b4_branch2b (ZeroPaddin (None, None, None, 2 0 res4b4_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b4_branch2b (Conv2D) (None, None, None, 2 589824 padding4b4_branch2b[0][0] __________________________________________________________________________________________________ bn4b4_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b4_branch2b[0][0] __________________________________________________________________________________________________ res4b4_branch2b_relu (Activatio (None, None, None, 2 0 bn4b4_branch2b[0][0] __________________________________________________________________________________________________ res4b4_branch2c (Conv2D) (None, None, None, 1 262144 res4b4_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b4_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b4_branch2c[0][0] 
__________________________________________________________________________________________________ res4b4 (Add) (None, None, None, 1 0 bn4b4_branch2c[0][0] res4b3_relu[0][0] __________________________________________________________________________________________________ res4b4_relu (Activation) (None, None, None, 1 0 res4b4[0][0] __________________________________________________________________________________________________ res4b5_branch2a (Conv2D) (None, None, None, 2 262144 res4b4_relu[0][0] __________________________________________________________________________________________________ bn4b5_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b5_branch2a[0][0] __________________________________________________________________________________________________ res4b5_branch2a_relu (Activatio (None, None, None, 2 0 bn4b5_branch2a[0][0] __________________________________________________________________________________________________ padding4b5_branch2b (ZeroPaddin (None, None, None, 2 0 res4b5_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b5_branch2b (Conv2D) (None, None, None, 2 589824 padding4b5_branch2b[0][0] __________________________________________________________________________________________________ bn4b5_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b5_branch2b[0][0] __________________________________________________________________________________________________ res4b5_branch2b_relu (Activatio (None, None, None, 2 0 bn4b5_branch2b[0][0] __________________________________________________________________________________________________ res4b5_branch2c (Conv2D) (None, None, None, 1 262144 res4b5_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b5_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b5_branch2c[0][0] 
__________________________________________________________________________________________________ res4b5 (Add) (None, None, None, 1 0 bn4b5_branch2c[0][0] res4b4_relu[0][0] __________________________________________________________________________________________________ res4b5_relu (Activation) (None, None, None, 1 0 res4b5[0][0] __________________________________________________________________________________________________ res4b6_branch2a (Conv2D) (None, None, None, 2 262144 res4b5_relu[0][0] __________________________________________________________________________________________________ bn4b6_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b6_branch2a[0][0] __________________________________________________________________________________________________ res4b6_branch2a_relu (Activatio (None, None, None, 2 0 bn4b6_branch2a[0][0] __________________________________________________________________________________________________ padding4b6_branch2b (ZeroPaddin (None, None, None, 2 0 res4b6_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b6_branch2b (Conv2D) (None, None, None, 2 589824 padding4b6_branch2b[0][0] __________________________________________________________________________________________________ bn4b6_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b6_branch2b[0][0] __________________________________________________________________________________________________ res4b6_branch2b_relu (Activatio (None, None, None, 2 0 bn4b6_branch2b[0][0] __________________________________________________________________________________________________ res4b6_branch2c (Conv2D) (None, None, None, 1 262144 res4b6_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b6_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b6_branch2c[0][0] 
__________________________________________________________________________________________________ res4b6 (Add) (None, None, None, 1 0 bn4b6_branch2c[0][0] res4b5_relu[0][0] __________________________________________________________________________________________________ res4b6_relu (Activation) (None, None, None, 1 0 res4b6[0][0] __________________________________________________________________________________________________ res4b7_branch2a (Conv2D) (None, None, None, 2 262144 res4b6_relu[0][0] __________________________________________________________________________________________________ bn4b7_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b7_branch2a[0][0] __________________________________________________________________________________________________ res4b7_branch2a_relu (Activatio (None, None, None, 2 0 bn4b7_branch2a[0][0] __________________________________________________________________________________________________ padding4b7_branch2b (ZeroPaddin (None, None, None, 2 0 res4b7_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b7_branch2b (Conv2D) (None, None, None, 2 589824 padding4b7_branch2b[0][0] __________________________________________________________________________________________________ bn4b7_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b7_branch2b[0][0] __________________________________________________________________________________________________ res4b7_branch2b_relu (Activatio (None, None, None, 2 0 bn4b7_branch2b[0][0] __________________________________________________________________________________________________ res4b7_branch2c (Conv2D) (None, None, None, 1 262144 res4b7_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b7_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b7_branch2c[0][0] 
__________________________________________________________________________________________________ res4b7 (Add) (None, None, None, 1 0 bn4b7_branch2c[0][0] res4b6_relu[0][0] __________________________________________________________________________________________________ res4b7_relu (Activation) (None, None, None, 1 0 res4b7[0][0] __________________________________________________________________________________________________ res4b8_branch2a (Conv2D) (None, None, None, 2 262144 res4b7_relu[0][0] __________________________________________________________________________________________________ bn4b8_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b8_branch2a[0][0] __________________________________________________________________________________________________ res4b8_branch2a_relu (Activatio (None, None, None, 2 0 bn4b8_branch2a[0][0] __________________________________________________________________________________________________ padding4b8_branch2b (ZeroPaddin (None, None, None, 2 0 res4b8_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b8_branch2b (Conv2D) (None, None, None, 2 589824 padding4b8_branch2b[0][0] __________________________________________________________________________________________________ bn4b8_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b8_branch2b[0][0] __________________________________________________________________________________________________ res4b8_branch2b_relu (Activatio (None, None, None, 2 0 bn4b8_branch2b[0][0] __________________________________________________________________________________________________ res4b8_branch2c (Conv2D) (None, None, None, 1 262144 res4b8_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b8_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b8_branch2c[0][0] 
__________________________________________________________________________________________________
res4b8 (Add)                           (None, None, None, 1024)  0           bn4b8_branch2c[0][0]
                                                                             res4b7_relu[0][0]
__________________________________________________________________________________________________
res4b8_relu (Activation)               (None, None, None, 1024)  0           res4b8[0][0]
__________________________________________________________________________________________________
res4b9_branch2a (Conv2D)               (None, None, None, 256)   262144      res4b8_relu[0][0]
__________________________________________________________________________________________________
bn4b9_branch2a (BatchNormalization)    (None, None, None, 256)   1024        res4b9_branch2a[0][0]
__________________________________________________________________________________________________
res4b9_branch2a_relu (Activation)      (None, None, None, 256)   0           bn4b9_branch2a[0][0]
__________________________________________________________________________________________________
padding4b9_branch2b (ZeroPadding2D)    (None, None, None, 256)   0           res4b9_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b9_branch2b (Conv2D)               (None, None, None, 256)   589824      padding4b9_branch2b[0][0]
__________________________________________________________________________________________________
bn4b9_branch2b (BatchNormalization)    (None, None, None, 256)   1024        res4b9_branch2b[0][0]
__________________________________________________________________________________________________
res4b9_branch2b_relu (Activation)      (None, None, None, 256)   0           bn4b9_branch2b[0][0]
__________________________________________________________________________________________________
res4b9_branch2c (Conv2D)               (None, None, None, 1024)  262144      res4b9_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b9_branch2c (BatchNormalization)    (None, None, None, 1024)  4096        res4b9_branch2c[0][0]
__________________________________________________________________________________________________
res4b9 (Add)                           (None, None, None, 1024)  0           bn4b9_branch2c[0][0]
                                                                             res4b8_relu[0][0]
__________________________________________________________________________________________________
res4b9_relu (Activation)               (None, None, None, 1024)  0           res4b9[0][0]
__________________________________________________________________________________________________
res4b10_branch2a (Conv2D)              (None, None, None, 256)   262144      res4b9_relu[0][0]
__________________________________________________________________________________________________
bn4b10_branch2a (BatchNormalization)   (None, None, None, 256)   1024        res4b10_branch2a[0][0]
__________________________________________________________________________________________________
res4b10_branch2a_relu (Activation)     (None, None, None, 256)   0           bn4b10_branch2a[0][0]
__________________________________________________________________________________________________
padding4b10_branch2b (ZeroPadding2D)   (None, None, None, 256)   0           res4b10_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b10_branch2b (Conv2D)              (None, None, None, 256)   589824      padding4b10_branch2b[0][0]
__________________________________________________________________________________________________
bn4b10_branch2b (BatchNormalization)   (None, None, None, 256)   1024        res4b10_branch2b[0][0]
__________________________________________________________________________________________________
res4b10_branch2b_relu (Activation)     (None, None, None, 256)   0           bn4b10_branch2b[0][0]
__________________________________________________________________________________________________
res4b10_branch2c (Conv2D)              (None, None, None, 1024)  262144      res4b10_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b10_branch2c (BatchNormalization)   (None, None, None, 1024)  4096        res4b10_branch2c[0][0]
__________________________________________________________________________________________________
res4b10 (Add)                          (None, None, None, 1024)  0           bn4b10_branch2c[0][0]
                                                                             res4b9_relu[0][0]
__________________________________________________________________________________________________
res4b10_relu (Activation)              (None, None, None, 1024)  0           res4b10[0][0]
__________________________________________________________________________________________________
res4b11_branch2a (Conv2D)              (None, None, None, 256)   262144      res4b10_relu[0][0]
__________________________________________________________________________________________________
bn4b11_branch2a (BatchNormalization)   (None, None, None, 256)   1024        res4b11_branch2a[0][0]
__________________________________________________________________________________________________
res4b11_branch2a_relu (Activation)     (None, None, None, 256)   0           bn4b11_branch2a[0][0]
__________________________________________________________________________________________________
padding4b11_branch2b (ZeroPadding2D)   (None, None, None, 256)   0           res4b11_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b11_branch2b (Conv2D)              (None, None, None, 256)   589824      padding4b11_branch2b[0][0]
__________________________________________________________________________________________________
bn4b11_branch2b (BatchNormalization)   (None, None, None, 256)   1024        res4b11_branch2b[0][0]
__________________________________________________________________________________________________
res4b11_branch2b_relu (Activation)     (None, None, None, 256)   0           bn4b11_branch2b[0][0]
__________________________________________________________________________________________________
res4b11_branch2c (Conv2D)              (None, None, None, 1024)  262144      res4b11_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b11_branch2c (BatchNormalization)   (None, None, None, 1024)  4096        res4b11_branch2c[0][0]
__________________________________________________________________________________________________
res4b11 (Add)                          (None, None, None, 1024)  0           bn4b11_branch2c[0][0]
                                                                             res4b10_relu[0][0]
__________________________________________________________________________________________________
res4b11_relu (Activation)              (None, None, None, 1024)  0           res4b11[0][0]
__________________________________________________________________________________________________
res4b12_branch2a (Conv2D)              (None, None, None, 256)   262144      res4b11_relu[0][0]
__________________________________________________________________________________________________
bn4b12_branch2a (BatchNormalization)   (None, None, None, 256)   1024        res4b12_branch2a[0][0]
__________________________________________________________________________________________________
res4b12_branch2a_relu (Activation)     (None, None, None, 256)   0           bn4b12_branch2a[0][0]
__________________________________________________________________________________________________
padding4b12_branch2b (ZeroPadding2D)   (None, None, None, 256)   0           res4b12_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b12_branch2b (Conv2D)              (None, None, None, 256)   589824      padding4b12_branch2b[0][0]
__________________________________________________________________________________________________
bn4b12_branch2b (BatchNormalization)   (None, None, None, 256)   1024        res4b12_branch2b[0][0]
__________________________________________________________________________________________________
res4b12_branch2b_relu (Activation)     (None, None, None, 256)   0           bn4b12_branch2b[0][0]
__________________________________________________________________________________________________
res4b12_branch2c (Conv2D)              (None, None, None, 1024)  262144      res4b12_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b12_branch2c (BatchNormalization)   (None, None, None, 1024)  4096        res4b12_branch2c[0][0]
__________________________________________________________________________________________________
res4b12 (Add)                          (None, None, None, 1024)  0           bn4b12_branch2c[0][0]
                                                                             res4b11_relu[0][0]
__________________________________________________________________________________________________
res4b12_relu (Activation)              (None, None, None, 1024)  0           res4b12[0][0]
__________________________________________________________________________________________________
res4b13_branch2a (Conv2D)              (None, None, None, 256)   262144      res4b12_relu[0][0]
__________________________________________________________________________________________________
bn4b13_branch2a (BatchNormalization)   (None, None, None, 256)   1024        res4b13_branch2a[0][0]
__________________________________________________________________________________________________
res4b13_branch2a_relu (Activation)     (None, None, None, 256)   0           bn4b13_branch2a[0][0]
__________________________________________________________________________________________________
padding4b13_branch2b (ZeroPadding2D)   (None, None, None, 256)   0           res4b13_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b13_branch2b (Conv2D)              (None, None, None, 256)   589824      padding4b13_branch2b[0][0]
__________________________________________________________________________________________________
bn4b13_branch2b (BatchNormalization)   (None, None, None, 256)   1024        res4b13_branch2b[0][0]
__________________________________________________________________________________________________
res4b13_branch2b_relu (Activation)     (None, None, None, 256)   0           bn4b13_branch2b[0][0]
__________________________________________________________________________________________________
res4b13_branch2c (Conv2D)              (None, None, None, 1024)  262144      res4b13_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b13_branch2c (BatchNormalization)   (None, None, None, 1024)  4096        res4b13_branch2c[0][0]
__________________________________________________________________________________________________
res4b13 (Add)                          (None, None, None, 1024)  0           bn4b13_branch2c[0][0]
                                                                             res4b12_relu[0][0]
__________________________________________________________________________________________________
res4b13_relu (Activation)              (None, None, None, 1024)  0           res4b13[0][0]
__________________________________________________________________________________________________
res4b14_branch2a (Conv2D)              (None, None, None, 256)   262144      res4b13_relu[0][0]
__________________________________________________________________________________________________
bn4b14_branch2a (BatchNormalization)   (None, None, None, 256)   1024        res4b14_branch2a[0][0]
__________________________________________________________________________________________________
res4b14_branch2a_relu (Activation)     (None, None, None, 256)   0           bn4b14_branch2a[0][0]
__________________________________________________________________________________________________
padding4b14_branch2b (ZeroPadding2D)   (None, None, None, 256)   0           res4b14_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b14_branch2b (Conv2D)              (None, None, None, 256)   589824      padding4b14_branch2b[0][0]
__________________________________________________________________________________________________
bn4b14_branch2b (BatchNormalization)   (None, None, None, 256)   1024        res4b14_branch2b[0][0]
__________________________________________________________________________________________________
res4b14_branch2b_relu (Activation)     (None, None, None, 256)   0           bn4b14_branch2b[0][0]
__________________________________________________________________________________________________
res4b14_branch2c (Conv2D)              (None, None, None, 1024)  262144      res4b14_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b14_branch2c (BatchNormalization)   (None, None, None, 1024)  4096        res4b14_branch2c[0][0]
__________________________________________________________________________________________________
res4b14 (Add)                          (None, None, None, 1024)  0           bn4b14_branch2c[0][0]
                                                                             res4b13_relu[0][0]
__________________________________________________________________________________________________
res4b14_relu (Activation)              (None, None, None, 1024)  0           res4b14[0][0]
__________________________________________________________________________________________________
res4b15_branch2a (Conv2D)              (None, None, None, 256)   262144      res4b14_relu[0][0]
__________________________________________________________________________________________________
bn4b15_branch2a (BatchNormalization)   (None, None, None, 256)   1024        res4b15_branch2a[0][0]
__________________________________________________________________________________________________
res4b15_branch2a_relu (Activation)     (None, None, None, 256)   0           bn4b15_branch2a[0][0]
__________________________________________________________________________________________________
padding4b15_branch2b (ZeroPadding2D)   (None, None, None, 256)   0           res4b15_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b15_branch2b (Conv2D)              (None, None, None, 256)   589824      padding4b15_branch2b[0][0]
__________________________________________________________________________________________________
bn4b15_branch2b (BatchNormalization)   (None, None, None, 256)   1024        res4b15_branch2b[0][0]
__________________________________________________________________________________________________
res4b15_branch2b_relu (Activation)     (None, None, None, 256)   0           bn4b15_branch2b[0][0]
__________________________________________________________________________________________________
res4b15_branch2c (Conv2D)              (None, None, None, 1024)  262144      res4b15_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b15_branch2c (BatchNormalization)   (None, None, None, 1024)  4096        res4b15_branch2c[0][0]
__________________________________________________________________________________________________
res4b15 (Add)                          (None, None, None, 1024)  0           bn4b15_branch2c[0][0]
                                                                             res4b14_relu[0][0]
__________________________________________________________________________________________________
res4b15_relu (Activation)              (None, None, None, 1024)  0           res4b15[0][0]
__________________________________________________________________________________________________
res4b16_branch2a (Conv2D)              (None, None, None, 256)   262144      res4b15_relu[0][0]
__________________________________________________________________________________________________
bn4b16_branch2a (BatchNormalization)   (None, None, None, 256)   1024        res4b16_branch2a[0][0]
__________________________________________________________________________________________________
res4b16_branch2a_relu (Activation)     (None, None, None, 256)   0           bn4b16_branch2a[0][0]
__________________________________________________________________________________________________
padding4b16_branch2b (ZeroPadding2D)   (None, None, None, 256)   0           res4b16_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b16_branch2b (Conv2D)              (None, None, None, 256)   589824      padding4b16_branch2b[0][0]
__________________________________________________________________________________________________
bn4b16_branch2b (BatchNormalization)   (None, None, None, 256)   1024        res4b16_branch2b[0][0]
__________________________________________________________________________________________________
res4b16_branch2b_relu (Activation)     (None, None, None, 256)   0           bn4b16_branch2b[0][0]
__________________________________________________________________________________________________
res4b16_branch2c (Conv2D)              (None, None, None, 1024)  262144      res4b16_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b16_branch2c (BatchNormalization)   (None, None, None, 1024)  4096        res4b16_branch2c[0][0]
__________________________________________________________________________________________________
res4b16 (Add)                          (None, None, None, 1024)  0           bn4b16_branch2c[0][0]
                                                                             res4b15_relu[0][0]
__________________________________________________________________________________________________
res4b16_relu (Activation)              (None, None, None, 1024)  0           res4b16[0][0]
__________________________________________________________________________________________________
res4b17_branch2a (Conv2D)              (None, None, None, 256)   262144      res4b16_relu[0][0]
__________________________________________________________________________________________________
bn4b17_branch2a (BatchNormalization)   (None, None, None, 256)   1024        res4b17_branch2a[0][0]
__________________________________________________________________________________________________
res4b17_branch2a_relu (Activation)     (None, None, None, 256)   0           bn4b17_branch2a[0][0]
__________________________________________________________________________________________________
padding4b17_branch2b (ZeroPadding2D)   (None, None, None, 256)   0           res4b17_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b17_branch2b (Conv2D)              (None, None, None, 256)   589824      padding4b17_branch2b[0][0]
__________________________________________________________________________________________________
bn4b17_branch2b (BatchNormalization)   (None, None, None, 256)   1024        res4b17_branch2b[0][0]
__________________________________________________________________________________________________
res4b17_branch2b_relu (Activation)     (None, None, None, 256)   0           bn4b17_branch2b[0][0]
__________________________________________________________________________________________________
res4b17_branch2c (Conv2D)              (None, None, None, 1024)  262144      res4b17_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b17_branch2c (BatchNormalization)   (None, None, None, 1024)  4096        res4b17_branch2c[0][0]
__________________________________________________________________________________________________
res4b17 (Add)                          (None, None, None, 1024)  0           bn4b17_branch2c[0][0]
                                                                             res4b16_relu[0][0]
__________________________________________________________________________________________________
res4b17_relu (Activation)              (None, None, None, 1024)  0           res4b17[0][0]
__________________________________________________________________________________________________
res4b18_branch2a (Conv2D)              (None, None, None, 256)   262144      res4b17_relu[0][0]
__________________________________________________________________________________________________
bn4b18_branch2a (BatchNormalization)   (None, None, None, 256)   1024        res4b18_branch2a[0][0]
__________________________________________________________________________________________________
res4b18_branch2a_relu (Activation)     (None, None, None, 256)   0           bn4b18_branch2a[0][0]
__________________________________________________________________________________________________
padding4b18_branch2b (ZeroPadding2D)   (None, None, None, 256)   0           res4b18_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b18_branch2b (Conv2D)              (None, None, None, 256)   589824      padding4b18_branch2b[0][0]
__________________________________________________________________________________________________
bn4b18_branch2b (BatchNormalization)   (None, None, None, 256)   1024        res4b18_branch2b[0][0]
__________________________________________________________________________________________________
res4b18_branch2b_relu (Activation)     (None, None, None, 256)   0           bn4b18_branch2b[0][0]
__________________________________________________________________________________________________
res4b18_branch2c (Conv2D)              (None, None, None, 1024)  262144      res4b18_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b18_branch2c (BatchNormalization)   (None, None, None, 1024)  4096        res4b18_branch2c[0][0]
__________________________________________________________________________________________________
res4b18 (Add)                          (None, None, None, 1024)  0           bn4b18_branch2c[0][0]
                                                                             res4b17_relu[0][0]
__________________________________________________________________________________________________
res4b18_relu (Activation)              (None, None, None, 1024)  0           res4b18[0][0]
__________________________________________________________________________________________________
res4b19_branch2a (Conv2D)              (None, None, None, 256)   262144      res4b18_relu[0][0]
__________________________________________________________________________________________________
bn4b19_branch2a (BatchNormalization)   (None, None, None, 256)   1024        res4b19_branch2a[0][0]
__________________________________________________________________________________________________
res4b19_branch2a_relu (Activation)     (None, None, None, 256)   0           bn4b19_branch2a[0][0]
__________________________________________________________________________________________________
padding4b19_branch2b (ZeroPadding2D)   (None, None, None, 256)   0           res4b19_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b19_branch2b (Conv2D)              (None, None, None, 256)   589824      padding4b19_branch2b[0][0]
__________________________________________________________________________________________________
bn4b19_branch2b (BatchNormalization)   (None, None, None, 256)   1024        res4b19_branch2b[0][0]
__________________________________________________________________________________________________
res4b19_branch2b_relu (Activation)     (None, None, None, 256)   0           bn4b19_branch2b[0][0]
__________________________________________________________________________________________________
res4b19_branch2c (Conv2D)              (None, None, None, 1024)  262144      res4b19_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b19_branch2c (BatchNormalization)   (None, None, None, 1024)  4096        res4b19_branch2c[0][0]
__________________________________________________________________________________________________
res4b19 (Add)                          (None, None, None, 1024)  0           bn4b19_branch2c[0][0]
                                                                             res4b18_relu[0][0]
__________________________________________________________________________________________________
res4b19_relu (Activation)              (None, None, None, 1024)  0           res4b19[0][0]
__________________________________________________________________________________________________
res4b20_branch2a (Conv2D)              (None, None, None, 256)   262144      res4b19_relu[0][0]
__________________________________________________________________________________________________
bn4b20_branch2a (BatchNormalization)   (None, None, None, 256)   1024        res4b20_branch2a[0][0]
__________________________________________________________________________________________________
res4b20_branch2a_relu (Activation)     (None, None, None, 256)   0           bn4b20_branch2a[0][0]
__________________________________________________________________________________________________
padding4b20_branch2b (ZeroPadding2D)   (None, None, None, 256)   0           res4b20_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b20_branch2b (Conv2D)              (None, None, None, 256)   589824      padding4b20_branch2b[0][0]
__________________________________________________________________________________________________
bn4b20_branch2b (BatchNormalization)   (None, None, None, 256)   1024        res4b20_branch2b[0][0]
__________________________________________________________________________________________________
res4b20_branch2b_relu (Activation)     (None, None, None, 256)   0           bn4b20_branch2b[0][0]
__________________________________________________________________________________________________
res4b20_branch2c (Conv2D)              (None, None, None, 1024)  262144      res4b20_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b20_branch2c (BatchNormalization)   (None, None, None, 1024)  4096        res4b20_branch2c[0][0]
__________________________________________________________________________________________________
res4b20 (Add)                          (None, None, None, 1024)  0           bn4b20_branch2c[0][0]
                                                                             res4b19_relu[0][0]
__________________________________________________________________________________________________
res4b20_relu (Activation)              (None, None, None, 1024)  0           res4b20[0][0]
__________________________________________________________________________________________________
res4b21_branch2a (Conv2D)              (None, None, None, 256)   262144      res4b20_relu[0][0]
__________________________________________________________________________________________________
bn4b21_branch2a (BatchNormalization)   (None, None, None, 256)   1024        res4b21_branch2a[0][0]
__________________________________________________________________________________________________
res4b21_branch2a_relu (Activation)     (None, None, None, 256)   0           bn4b21_branch2a[0][0]
__________________________________________________________________________________________________
padding4b21_branch2b (ZeroPadding2D)   (None, None, None, 256)   0           res4b21_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b21_branch2b (Conv2D)              (None, None, None, 256)   589824      padding4b21_branch2b[0][0]
__________________________________________________________________________________________________
bn4b21_branch2b (BatchNormalization)   (None, None, None, 256)   1024        res4b21_branch2b[0][0]
__________________________________________________________________________________________________
res4b21_branch2b_relu (Activation)     (None, None, None, 256)   0           bn4b21_branch2b[0][0]
__________________________________________________________________________________________________
res4b21_branch2c (Conv2D)              (None, None, None, 1024)  262144      res4b21_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b21_branch2c (BatchNormalization)   (None, None, None, 1024)  4096        res4b21_branch2c[0][0]
__________________________________________________________________________________________________
res4b21 (Add)                          (None, None, None, 1024)  0           bn4b21_branch2c[0][0]
                                                                             res4b20_relu[0][0]
__________________________________________________________________________________________________
res4b21_relu (Activation)              (None, None, None, 1024)  0           res4b21[0][0]
__________________________________________________________________________________________________
res4b22_branch2a (Conv2D)              (None, None, None, 256)   262144      res4b21_relu[0][0]
__________________________________________________________________________________________________
bn4b22_branch2a (BatchNormalization)   (None, None, None, 256)   1024        res4b22_branch2a[0][0]
__________________________________________________________________________________________________
res4b22_branch2a_relu (Activation)     (None, None, None, 256)   0           bn4b22_branch2a[0][0]
__________________________________________________________________________________________________
padding4b22_branch2b (ZeroPadding2D)   (None, None, None, 256)   0           res4b22_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b22_branch2b (Conv2D)              (None, None, None, 256)   589824      padding4b22_branch2b[0][0]
__________________________________________________________________________________________________
bn4b22_branch2b (BatchNormalization)   (None, None, None, 256)   1024        res4b22_branch2b[0][0]
__________________________________________________________________________________________________
res4b22_branch2b_relu (Activation)     (None, None, None, 256)   0           bn4b22_branch2b[0][0]
__________________________________________________________________________________________________
res4b22_branch2c (Conv2D)              (None, None, None, 1024)  262144      res4b22_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b22_branch2c (BatchNormalization)   (None, None, None, 1024)  4096        res4b22_branch2c[0][0]
__________________________________________________________________________________________________
res4b22 (Add)                          (None, None, None, 1024)  0           bn4b22_branch2c[0][0]
                                                                             res4b21_relu[0][0]
__________________________________________________________________________________________________
res4b22_relu (Activation)              (None, None, None, 1024)  0           res4b22[0][0]
__________________________________________________________________________________________________
res5a_branch2a (Conv2D)                (None, None, None, 512)   524288      res4b22_relu[0][0]
__________________________________________________________________________________________________
bn5a_branch2a (BatchNormalization)     (None, None, None, 512)   2048        res5a_branch2a[0][0]
__________________________________________________________________________________________________
res5a_branch2a_relu (Activation)       (None, None, None, 512)   0           bn5a_branch2a[0][0]
__________________________________________________________________________________________________
padding5a_branch2b (ZeroPadding2D)     (None, None, None, 512)   0           res5a_branch2a_relu[0][0]
__________________________________________________________________________________________________
res5a_branch2b (Conv2D)                (None, None, None, 512)   2359296     padding5a_branch2b[0][0]
__________________________________________________________________________________________________
bn5a_branch2b (BatchNormalization)     (None, None, None, 512)   2048        res5a_branch2b[0][0]
__________________________________________________________________________________________________
res5a_branch2b_relu (Activation)       (None, None, None, 512)   0           bn5a_branch2b[0][0]
__________________________________________________________________________________________________
res5a_branch2c (Conv2D)                (None, None, None, 2048)  1048576     res5a_branch2b_relu[0][0]
__________________________________________________________________________________________________
res5a_branch1 (Conv2D)                 (None, None, None, 2048)  2097152     res4b22_relu[0][0]
__________________________________________________________________________________________________
bn5a_branch2c (BatchNormalization)     (None, None, None, 2048)  8192        res5a_branch2c[0][0]
__________________________________________________________________________________________________
bn5a_branch1 (BatchNormalization)      (None, None, None, 2048)  8192        res5a_branch1[0][0]
__________________________________________________________________________________________________
res5a (Add)                            (None, None, None, 2048)  0           bn5a_branch2c[0][0]
                                                                             bn5a_branch1[0][0]
__________________________________________________________________________________________________
res5a_relu (Activation)                (None, None, None, 2048)  0           res5a[0][0]
__________________________________________________________________________________________________
res5b_branch2a (Conv2D)                (None, None, None, 512)   1048576     res5a_relu[0][0]
__________________________________________________________________________________________________
bn5b_branch2a (BatchNormalization)     (None, None, None, 512)   2048        res5b_branch2a[0][0]
__________________________________________________________________________________________________
res5b_branch2a_relu (Activation)       (None, None, None, 512)   0           bn5b_branch2a[0][0]
__________________________________________________________________________________________________
padding5b_branch2b (ZeroPadding2D)     (None, None, None, 512)   0           res5b_branch2a_relu[0][0]
__________________________________________________________________________________________________
res5b_branch2b (Conv2D)                (None, None, None, 512)   2359296     padding5b_branch2b[0][0]
__________________________________________________________________________________________________
bn5b_branch2b (BatchNormalization)     (None, None, None, 512)   2048        res5b_branch2b[0][0]
__________________________________________________________________________________________________
res5b_branch2b_relu (Activation)       (None, None, None, 512)   0           bn5b_branch2b[0][0]
__________________________________________________________________________________________________
res5b_branch2c (Conv2D)                (None, None, None, 2048)  1048576     res5b_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn5b_branch2c (BatchNormalization)     (None, None, None, 2048)  8192        res5b_branch2c[0][0]
__________________________________________________________________________________________________
res5b (Add)                            (None, None, None, 2048)  0           bn5b_branch2c[0][0]
                                                                             res5a_relu[0][0]
__________________________________________________________________________________________________
res5b_relu (Activation)                (None, None, None, 2048)  0           res5b[0][0]
__________________________________________________________________________________________________
res5c_branch2a (Conv2D)                (None, None, None, 512)   1048576     res5b_relu[0][0]
__________________________________________________________________________________________________
bn5c_branch2a (BatchNormalization)     (None, None, None, 512)   2048        res5c_branch2a[0][0]
__________________________________________________________________________________________________
res5c_branch2a_relu (Activation)       (None, None, None, 512)   0           bn5c_branch2a[0][0]
__________________________________________________________________________________________________
padding5c_branch2b (ZeroPadding2D)     (None, None, None, 512)   0           res5c_branch2a_relu[0][0]
__________________________________________________________________________________________________
res5c_branch2b (Conv2D)                (None, None, None, 512)   2359296     padding5c_branch2b[0][0]
__________________________________________________________________________________________________
bn5c_branch2b (BatchNormalization)     (None, None, None, 512)   2048        res5c_branch2b[0][0]
__________________________________________________________________________________________________
res5c_branch2b_relu (Activation)       (None, None, None, 512)   0           bn5c_branch2b[0][0]
__________________________________________________________________________________________________ res5c_branch2c (Conv2D) (None, None, None, 2 1048576 res5c_branch2b_relu[0][0] __________________________________________________________________________________________________ bn5c_branch2c (BatchNormalizati (None, None, None, 2 8192 res5c_branch2c[0][0] __________________________________________________________________________________________________ res5c (Add) (None, None, None, 2 0 bn5c_branch2c[0][0] res5b_relu[0][0] __________________________________________________________________________________________________ res5c_relu (Activation) (None, None, None, 2 0 res5c[0][0] __________________________________________________________________________________________________ C5_reduced (Conv2D) (None, None, None, 2 524544 res5c_relu[0][0] __________________________________________________________________________________________________ P5_upsampled (UpsampleLike) (None, None, None, 2 0 C5_reduced[0][0] res4b22_relu[0][0] __________________________________________________________________________________________________ C4_reduced (Conv2D) (None, None, None, 2 262400 res4b22_relu[0][0] __________________________________________________________________________________________________ P4_merged (Add) (None, None, None, 2 0 P5_upsampled[0][0] C4_reduced[0][0] __________________________________________________________________________________________________ P4_upsampled (UpsampleLike) (None, None, None, 2 0 P4_merged[0][0] res3b3_relu[0][0] __________________________________________________________________________________________________ C3_reduced (Conv2D) (None, None, None, 2 131328 res3b3_relu[0][0] __________________________________________________________________________________________________ P6 (Conv2D) (None, None, None, 2 4718848 res5c_relu[0][0] __________________________________________________________________________________________________ P3_merged (Add) 
(None, None, None, 2 0 P4_upsampled[0][0] C3_reduced[0][0] __________________________________________________________________________________________________ C6_relu (Activation) (None, None, None, 2 0 P6[0][0] __________________________________________________________________________________________________ P3 (Conv2D) (None, None, None, 2 590080 P3_merged[0][0] __________________________________________________________________________________________________ P4 (Conv2D) (None, None, None, 2 590080 P4_merged[0][0] __________________________________________________________________________________________________ P5 (Conv2D) (None, None, None, 2 590080 C5_reduced[0][0] __________________________________________________________________________________________________ P7 (Conv2D) (None, None, None, 2 590080 C6_relu[0][0] __________________________________________________________________________________________________ regression_submodel (Model) (None, None, 4) 2443300 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0] __________________________________________________________________________________________________ classification_submodel (Model) (None, None, 1) 2381065 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0] __________________________________________________________________________________________________ regression (Concatenate) (None, None, 4) 0 regression_submodel[1][0] regression_submodel[2][0] regression_submodel[3][0] regression_submodel[4][0] regression_submodel[5][0] __________________________________________________________________________________________________ classification (Concatenate) (None, None, 1) 0 classification_submodel[1][0] classification_submodel[2][0] classification_submodel[3][0] classification_submodel[4][0] classification_submodel[5][0] ================================================================================================== Total params: 55,427,309 Trainable params: 55,216,621 Non-trainable params: 210,688 
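The per-layer parameter counts above follow directly from the Conv2D formula kernel_h × kernel_w × in_channels × out_channels, plus out_channels when the layer has a bias. In keras-retinanet the backbone convolutions are bias-free (each is followed by batch normalization), while the FPN convolutions keep their biases. The Output Shape column is truncated in this log, so the channel widths used below (3, 64, 512, 1024, 2048, 256) are assumptions inferred from the standard ResNet101/FPN design, not read from the summary itself:

```python
def conv2d_params(kh, kw, c_in, c_out, bias=True):
    """Parameter count of a Conv2D layer: kernel weights plus an optional per-filter bias."""
    return kh * kw * c_in * c_out + (c_out if bias else 0)

# Backbone convolutions (no bias; each is followed by a BatchNormalization layer)
assert conv2d_params(7, 7, 3, 64, bias=False) == 9408        # conv1
assert conv2d_params(3, 3, 512, 512, bias=False) == 2359296  # res5b_branch2b

# FPN convolutions (with bias)
assert conv2d_params(1, 1, 2048, 256) == 524544              # C5_reduced
assert conv2d_params(1, 1, 1024, 256) == 262400              # C4_reduced
assert conv2d_params(1, 1, 512, 256) == 131328               # C3_reduced
assert conv2d_params(3, 3, 2048, 256) == 4718848             # P6
assert conv2d_params(3, 3, 256, 256) == 590080               # P3, P4, P5, P7

print("all counts match the summary")
```

Each assertion reproduces a `Param #` value from the table, which is a quick way to sanity-check the inferred channel widths.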
__________________________________________________________________________________________________
None
Epoch 1/150
(per-batch progress updates, normally overwritten in place in the terminal, condensed here to every 50th batch)
  1/500 [..............................] - ETA: 1:02:34 - loss: 3.9498 - regression_loss: 2.8153 - classification_loss: 1.1345
 50/500 [==>...........................] - ETA: 3:34 - loss: 3.9693 - regression_loss: 2.8406 - classification_loss: 1.1286
100/500 [=====>........................] - ETA: 2:40 - loss: 3.8275 - regression_loss: 2.7208 - classification_loss: 1.1067
150/500 [========>.....................] - ETA: 2:11 - loss: 3.6683 - regression_loss: 2.6543 - classification_loss: 1.0140
200/500 [===========>..................] - ETA: 1:49 - loss: 3.5130 - regression_loss: 2.6100 - classification_loss: 0.9030
250/500 [==============>...............] - ETA: 1:29 - loss: 3.3968 - regression_loss: 2.5787 - classification_loss: 0.8181
270/500 [===============>..............] - ETA: 1:21 - loss: 3.3617 - regression_loss: 2.5696 - classification_loss: 0.7921
271/500 [===============>..............]
- ETA: 1:21 - loss: 3.3603 - regression_loss: 2.5696 - classification_loss: 0.7907 272/500 [===============>..............] - ETA: 1:21 - loss: 3.3607 - regression_loss: 2.5699 - classification_loss: 0.7908 273/500 [===============>..............] - ETA: 1:20 - loss: 3.3586 - regression_loss: 2.5689 - classification_loss: 0.7897 274/500 [===============>..............] - ETA: 1:20 - loss: 3.3565 - regression_loss: 2.5683 - classification_loss: 0.7882 275/500 [===============>..............] - ETA: 1:19 - loss: 3.3563 - regression_loss: 2.5685 - classification_loss: 0.7878 276/500 [===============>..............] - ETA: 1:19 - loss: 3.3557 - regression_loss: 2.5690 - classification_loss: 0.7867 277/500 [===============>..............] - ETA: 1:19 - loss: 3.3539 - regression_loss: 2.5686 - classification_loss: 0.7853 278/500 [===============>..............] - ETA: 1:18 - loss: 3.3543 - regression_loss: 2.5692 - classification_loss: 0.7851 279/500 [===============>..............] - ETA: 1:18 - loss: 3.3540 - regression_loss: 2.5699 - classification_loss: 0.7842 280/500 [===============>..............] - ETA: 1:18 - loss: 3.3519 - regression_loss: 2.5692 - classification_loss: 0.7826 281/500 [===============>..............] - ETA: 1:17 - loss: 3.3503 - regression_loss: 2.5688 - classification_loss: 0.7815 282/500 [===============>..............] - ETA: 1:17 - loss: 3.3489 - regression_loss: 2.5685 - classification_loss: 0.7804 283/500 [===============>..............] - ETA: 1:16 - loss: 3.3468 - regression_loss: 2.5679 - classification_loss: 0.7789 284/500 [================>.............] - ETA: 1:16 - loss: 3.3451 - regression_loss: 2.5676 - classification_loss: 0.7776 285/500 [================>.............] - ETA: 1:16 - loss: 3.3424 - regression_loss: 2.5664 - classification_loss: 0.7761 286/500 [================>.............] - ETA: 1:15 - loss: 3.3418 - regression_loss: 2.5662 - classification_loss: 0.7756 287/500 [================>.............] 
- ETA: 1:15 - loss: 3.3401 - regression_loss: 2.5655 - classification_loss: 0.7746 288/500 [================>.............] - ETA: 1:15 - loss: 3.3389 - regression_loss: 2.5654 - classification_loss: 0.7735 289/500 [================>.............] - ETA: 1:14 - loss: 3.3406 - regression_loss: 2.5658 - classification_loss: 0.7747 290/500 [================>.............] - ETA: 1:14 - loss: 3.3394 - regression_loss: 2.5654 - classification_loss: 0.7740 291/500 [================>.............] - ETA: 1:13 - loss: 3.3381 - regression_loss: 2.5652 - classification_loss: 0.7729 292/500 [================>.............] - ETA: 1:13 - loss: 3.3367 - regression_loss: 2.5649 - classification_loss: 0.7718 293/500 [================>.............] - ETA: 1:13 - loss: 3.3345 - regression_loss: 2.5640 - classification_loss: 0.7704 294/500 [================>.............] - ETA: 1:12 - loss: 3.3321 - regression_loss: 2.5630 - classification_loss: 0.7691 295/500 [================>.............] - ETA: 1:12 - loss: 3.3307 - regression_loss: 2.5626 - classification_loss: 0.7681 296/500 [================>.............] - ETA: 1:12 - loss: 3.3288 - regression_loss: 2.5617 - classification_loss: 0.7671 297/500 [================>.............] - ETA: 1:11 - loss: 3.3289 - regression_loss: 2.5626 - classification_loss: 0.7662 298/500 [================>.............] - ETA: 1:11 - loss: 3.3274 - regression_loss: 2.5622 - classification_loss: 0.7652 299/500 [================>.............] - ETA: 1:11 - loss: 3.3254 - regression_loss: 2.5614 - classification_loss: 0.7641 300/500 [=================>............] - ETA: 1:10 - loss: 3.3259 - regression_loss: 2.5625 - classification_loss: 0.7634 301/500 [=================>............] - ETA: 1:10 - loss: 3.3244 - regression_loss: 2.5620 - classification_loss: 0.7625 302/500 [=================>............] - ETA: 1:09 - loss: 3.3237 - regression_loss: 2.5613 - classification_loss: 0.7623 303/500 [=================>............] 
- ETA: 1:09 - loss: 3.3217 - regression_loss: 2.5606 - classification_loss: 0.7611 304/500 [=================>............] - ETA: 1:09 - loss: 3.3204 - regression_loss: 2.5601 - classification_loss: 0.7603 305/500 [=================>............] - ETA: 1:08 - loss: 3.3187 - regression_loss: 2.5597 - classification_loss: 0.7590 306/500 [=================>............] - ETA: 1:08 - loss: 3.3170 - regression_loss: 2.5590 - classification_loss: 0.7580 307/500 [=================>............] - ETA: 1:08 - loss: 3.3157 - regression_loss: 2.5586 - classification_loss: 0.7572 308/500 [=================>............] - ETA: 1:07 - loss: 3.3141 - regression_loss: 2.5580 - classification_loss: 0.7561 309/500 [=================>............] - ETA: 1:07 - loss: 3.3125 - regression_loss: 2.5576 - classification_loss: 0.7550 310/500 [=================>............] - ETA: 1:07 - loss: 3.3120 - regression_loss: 2.5581 - classification_loss: 0.7539 311/500 [=================>............] - ETA: 1:06 - loss: 3.3095 - regression_loss: 2.5568 - classification_loss: 0.7527 312/500 [=================>............] - ETA: 1:06 - loss: 3.3078 - regression_loss: 2.5562 - classification_loss: 0.7516 313/500 [=================>............] - ETA: 1:05 - loss: 3.3058 - regression_loss: 2.5554 - classification_loss: 0.7504 314/500 [=================>............] - ETA: 1:05 - loss: 3.3029 - regression_loss: 2.5534 - classification_loss: 0.7495 315/500 [=================>............] - ETA: 1:05 - loss: 3.3024 - regression_loss: 2.5537 - classification_loss: 0.7487 316/500 [=================>............] - ETA: 1:04 - loss: 3.3009 - regression_loss: 2.5531 - classification_loss: 0.7478 317/500 [==================>...........] - ETA: 1:04 - loss: 3.2989 - regression_loss: 2.5524 - classification_loss: 0.7465 318/500 [==================>...........] - ETA: 1:04 - loss: 3.2984 - regression_loss: 2.5528 - classification_loss: 0.7456 319/500 [==================>...........] 
- ETA: 1:03 - loss: 3.2966 - regression_loss: 2.5520 - classification_loss: 0.7446 320/500 [==================>...........] - ETA: 1:03 - loss: 3.2943 - regression_loss: 2.5510 - classification_loss: 0.7433 321/500 [==================>...........] - ETA: 1:03 - loss: 3.2930 - regression_loss: 2.5505 - classification_loss: 0.7425 322/500 [==================>...........] - ETA: 1:02 - loss: 3.2912 - regression_loss: 2.5498 - classification_loss: 0.7414 323/500 [==================>...........] - ETA: 1:02 - loss: 3.2896 - regression_loss: 2.5493 - classification_loss: 0.7403 324/500 [==================>...........] - ETA: 1:02 - loss: 3.2884 - regression_loss: 2.5490 - classification_loss: 0.7394 325/500 [==================>...........] - ETA: 1:01 - loss: 3.2863 - regression_loss: 2.5481 - classification_loss: 0.7382 326/500 [==================>...........] - ETA: 1:01 - loss: 3.2849 - regression_loss: 2.5476 - classification_loss: 0.7373 327/500 [==================>...........] - ETA: 1:01 - loss: 3.2831 - regression_loss: 2.5469 - classification_loss: 0.7363 328/500 [==================>...........] - ETA: 1:00 - loss: 3.2830 - regression_loss: 2.5471 - classification_loss: 0.7359 329/500 [==================>...........] - ETA: 1:00 - loss: 3.2813 - regression_loss: 2.5463 - classification_loss: 0.7350 330/500 [==================>...........] - ETA: 1:00 - loss: 3.2806 - regression_loss: 2.5457 - classification_loss: 0.7349 331/500 [==================>...........] - ETA: 59s - loss: 3.2798 - regression_loss: 2.5457 - classification_loss: 0.7341 332/500 [==================>...........] - ETA: 59s - loss: 3.2778 - regression_loss: 2.5447 - classification_loss: 0.7330 333/500 [==================>...........] - ETA: 58s - loss: 3.2758 - regression_loss: 2.5439 - classification_loss: 0.7319 334/500 [===================>..........] - ETA: 58s - loss: 3.2737 - regression_loss: 2.5429 - classification_loss: 0.7307 335/500 [===================>..........] 
- ETA: 58s - loss: 3.2724 - regression_loss: 2.5425 - classification_loss: 0.7298 336/500 [===================>..........] - ETA: 57s - loss: 3.2703 - regression_loss: 2.5416 - classification_loss: 0.7288 337/500 [===================>..........] - ETA: 57s - loss: 3.2692 - regression_loss: 2.5414 - classification_loss: 0.7278 338/500 [===================>..........] - ETA: 57s - loss: 3.2675 - regression_loss: 2.5408 - classification_loss: 0.7267 339/500 [===================>..........] - ETA: 56s - loss: 3.2663 - regression_loss: 2.5403 - classification_loss: 0.7260 340/500 [===================>..........] - ETA: 56s - loss: 3.2653 - regression_loss: 2.5404 - classification_loss: 0.7249 341/500 [===================>..........] - ETA: 56s - loss: 3.2637 - regression_loss: 2.5396 - classification_loss: 0.7240 342/500 [===================>..........] - ETA: 55s - loss: 3.2624 - regression_loss: 2.5391 - classification_loss: 0.7233 343/500 [===================>..........] - ETA: 55s - loss: 3.2610 - regression_loss: 2.5386 - classification_loss: 0.7224 344/500 [===================>..........] - ETA: 55s - loss: 3.2595 - regression_loss: 2.5380 - classification_loss: 0.7215 345/500 [===================>..........] - ETA: 54s - loss: 3.2587 - regression_loss: 2.5376 - classification_loss: 0.7211 346/500 [===================>..........] - ETA: 54s - loss: 3.2569 - regression_loss: 2.5368 - classification_loss: 0.7201 347/500 [===================>..........] - ETA: 54s - loss: 3.2570 - regression_loss: 2.5375 - classification_loss: 0.7196 348/500 [===================>..........] - ETA: 53s - loss: 3.2535 - regression_loss: 2.5354 - classification_loss: 0.7181 349/500 [===================>..........] - ETA: 53s - loss: 3.2525 - regression_loss: 2.5351 - classification_loss: 0.7174 350/500 [====================>.........] - ETA: 52s - loss: 3.2526 - regression_loss: 2.5342 - classification_loss: 0.7184 351/500 [====================>.........] 
- ETA: 52s - loss: 3.2523 - regression_loss: 2.5349 - classification_loss: 0.7174 352/500 [====================>.........] - ETA: 52s - loss: 3.2504 - regression_loss: 2.5341 - classification_loss: 0.7163 353/500 [====================>.........] - ETA: 51s - loss: 3.2499 - regression_loss: 2.5340 - classification_loss: 0.7159 354/500 [====================>.........] - ETA: 51s - loss: 3.2483 - regression_loss: 2.5334 - classification_loss: 0.7149 355/500 [====================>.........] - ETA: 51s - loss: 3.2451 - regression_loss: 2.5312 - classification_loss: 0.7139 356/500 [====================>.........] - ETA: 50s - loss: 3.2430 - regression_loss: 2.5302 - classification_loss: 0.7128 357/500 [====================>.........] - ETA: 50s - loss: 3.2412 - regression_loss: 2.5294 - classification_loss: 0.7118 358/500 [====================>.........] - ETA: 50s - loss: 3.2393 - regression_loss: 2.5284 - classification_loss: 0.7109 359/500 [====================>.........] - ETA: 49s - loss: 3.2376 - regression_loss: 2.5276 - classification_loss: 0.7100 360/500 [====================>.........] - ETA: 49s - loss: 3.2357 - regression_loss: 2.5265 - classification_loss: 0.7092 361/500 [====================>.........] - ETA: 49s - loss: 3.2340 - regression_loss: 2.5257 - classification_loss: 0.7083 362/500 [====================>.........] - ETA: 48s - loss: 3.2320 - regression_loss: 2.5246 - classification_loss: 0.7074 363/500 [====================>.........] - ETA: 48s - loss: 3.2299 - regression_loss: 2.5236 - classification_loss: 0.7063 364/500 [====================>.........] - ETA: 47s - loss: 3.2281 - regression_loss: 2.5227 - classification_loss: 0.7054 365/500 [====================>.........] - ETA: 47s - loss: 3.2271 - regression_loss: 2.5225 - classification_loss: 0.7046 366/500 [====================>.........] - ETA: 47s - loss: 3.2248 - regression_loss: 2.5212 - classification_loss: 0.7035 367/500 [=====================>........] 
- ETA: 46s - loss: 3.2240 - regression_loss: 2.5210 - classification_loss: 0.7030 368/500 [=====================>........] - ETA: 46s - loss: 3.2222 - regression_loss: 2.5201 - classification_loss: 0.7021 369/500 [=====================>........] - ETA: 46s - loss: 3.2211 - regression_loss: 2.5197 - classification_loss: 0.7014 370/500 [=====================>........] - ETA: 45s - loss: 3.2207 - regression_loss: 2.5194 - classification_loss: 0.7013 371/500 [=====================>........] - ETA: 45s - loss: 3.2192 - regression_loss: 2.5188 - classification_loss: 0.7003 372/500 [=====================>........] - ETA: 45s - loss: 3.2179 - regression_loss: 2.5184 - classification_loss: 0.6994 373/500 [=====================>........] - ETA: 44s - loss: 3.2167 - regression_loss: 2.5180 - classification_loss: 0.6987 374/500 [=====================>........] - ETA: 44s - loss: 3.2153 - regression_loss: 2.5174 - classification_loss: 0.6979 375/500 [=====================>........] - ETA: 43s - loss: 3.2133 - regression_loss: 2.5163 - classification_loss: 0.6970 376/500 [=====================>........] - ETA: 43s - loss: 3.2111 - regression_loss: 2.5152 - classification_loss: 0.6959 377/500 [=====================>........] - ETA: 43s - loss: 3.2090 - regression_loss: 2.5141 - classification_loss: 0.6950 378/500 [=====================>........] - ETA: 42s - loss: 3.2078 - regression_loss: 2.5136 - classification_loss: 0.6941 379/500 [=====================>........] - ETA: 42s - loss: 3.2069 - regression_loss: 2.5132 - classification_loss: 0.6937 380/500 [=====================>........] - ETA: 42s - loss: 3.2052 - regression_loss: 2.5125 - classification_loss: 0.6927 381/500 [=====================>........] - ETA: 41s - loss: 3.2039 - regression_loss: 2.5120 - classification_loss: 0.6919 382/500 [=====================>........] - ETA: 41s - loss: 3.2033 - regression_loss: 2.5121 - classification_loss: 0.6912 383/500 [=====================>........] 
- ETA: 41s - loss: 3.2015 - regression_loss: 2.5110 - classification_loss: 0.6905 384/500 [======================>.......] - ETA: 40s - loss: 3.1992 - regression_loss: 2.5098 - classification_loss: 0.6895 385/500 [======================>.......] - ETA: 40s - loss: 3.1983 - regression_loss: 2.5094 - classification_loss: 0.6889 386/500 [======================>.......] - ETA: 40s - loss: 3.1973 - regression_loss: 2.5091 - classification_loss: 0.6882 387/500 [======================>.......] - ETA: 39s - loss: 3.1967 - regression_loss: 2.5089 - classification_loss: 0.6878 388/500 [======================>.......] - ETA: 39s - loss: 3.1951 - regression_loss: 2.5081 - classification_loss: 0.6869 389/500 [======================>.......] - ETA: 39s - loss: 3.1936 - regression_loss: 2.5074 - classification_loss: 0.6862 390/500 [======================>.......] - ETA: 38s - loss: 3.1913 - regression_loss: 2.5060 - classification_loss: 0.6852 391/500 [======================>.......] - ETA: 38s - loss: 3.1901 - regression_loss: 2.5056 - classification_loss: 0.6845 392/500 [======================>.......] - ETA: 38s - loss: 3.1881 - regression_loss: 2.5045 - classification_loss: 0.6836 393/500 [======================>.......] - ETA: 37s - loss: 3.1881 - regression_loss: 2.5047 - classification_loss: 0.6835 394/500 [======================>.......] - ETA: 37s - loss: 3.1868 - regression_loss: 2.5042 - classification_loss: 0.6826 395/500 [======================>.......] - ETA: 37s - loss: 3.1854 - regression_loss: 2.5036 - classification_loss: 0.6818 396/500 [======================>.......] - ETA: 36s - loss: 3.1853 - regression_loss: 2.5042 - classification_loss: 0.6811 397/500 [======================>.......] - ETA: 36s - loss: 3.1837 - regression_loss: 2.5034 - classification_loss: 0.6802 398/500 [======================>.......] - ETA: 35s - loss: 3.1816 - regression_loss: 2.5022 - classification_loss: 0.6794 399/500 [======================>.......] 
- ETA: 35s - loss: 3.1808 - regression_loss: 2.5019 - classification_loss: 0.6789 400/500 [=======================>......] - ETA: 35s - loss: 3.1788 - regression_loss: 2.5007 - classification_loss: 0.6781 401/500 [=======================>......] - ETA: 34s - loss: 3.1782 - regression_loss: 2.5008 - classification_loss: 0.6773 402/500 [=======================>......] - ETA: 34s - loss: 3.1778 - regression_loss: 2.5010 - classification_loss: 0.6768 403/500 [=======================>......] - ETA: 34s - loss: 3.1758 - regression_loss: 2.4999 - classification_loss: 0.6759 404/500 [=======================>......] - ETA: 33s - loss: 3.1741 - regression_loss: 2.4989 - classification_loss: 0.6752 405/500 [=======================>......] - ETA: 33s - loss: 3.1731 - regression_loss: 2.4986 - classification_loss: 0.6745 406/500 [=======================>......] - ETA: 33s - loss: 3.1725 - regression_loss: 2.4987 - classification_loss: 0.6737 407/500 [=======================>......] - ETA: 32s - loss: 3.1709 - regression_loss: 2.4980 - classification_loss: 0.6729 408/500 [=======================>......] - ETA: 32s - loss: 3.1693 - regression_loss: 2.4973 - classification_loss: 0.6721 409/500 [=======================>......] - ETA: 32s - loss: 3.1691 - regression_loss: 2.4974 - classification_loss: 0.6717 410/500 [=======================>......] - ETA: 31s - loss: 3.1678 - regression_loss: 2.4968 - classification_loss: 0.6711 411/500 [=======================>......] - ETA: 31s - loss: 3.1668 - regression_loss: 2.4963 - classification_loss: 0.6705 412/500 [=======================>......] - ETA: 31s - loss: 3.1653 - regression_loss: 2.4954 - classification_loss: 0.6699 413/500 [=======================>......] - ETA: 30s - loss: 3.1636 - regression_loss: 2.4945 - classification_loss: 0.6691 414/500 [=======================>......] - ETA: 30s - loss: 3.1628 - regression_loss: 2.4943 - classification_loss: 0.6685 415/500 [=======================>......] 
- ETA: 29s - loss: 3.1621 - regression_loss: 2.4938 - classification_loss: 0.6682 416/500 [=======================>......] - ETA: 29s - loss: 3.1610 - regression_loss: 2.4933 - classification_loss: 0.6677 417/500 [========================>.....] - ETA: 29s - loss: 3.1616 - regression_loss: 2.4926 - classification_loss: 0.6690 418/500 [========================>.....] - ETA: 28s - loss: 3.1599 - regression_loss: 2.4917 - classification_loss: 0.6682 419/500 [========================>.....] - ETA: 28s - loss: 3.1615 - regression_loss: 2.4924 - classification_loss: 0.6691 420/500 [========================>.....] - ETA: 28s - loss: 3.1599 - regression_loss: 2.4914 - classification_loss: 0.6685 421/500 [========================>.....] - ETA: 27s - loss: 3.1583 - regression_loss: 2.4905 - classification_loss: 0.6678 422/500 [========================>.....] - ETA: 27s - loss: 3.1568 - regression_loss: 2.4897 - classification_loss: 0.6671 423/500 [========================>.....] - ETA: 27s - loss: 3.1549 - regression_loss: 2.4886 - classification_loss: 0.6663 424/500 [========================>.....] - ETA: 26s - loss: 3.1537 - regression_loss: 2.4881 - classification_loss: 0.6656 425/500 [========================>.....] - ETA: 26s - loss: 3.1524 - regression_loss: 2.4875 - classification_loss: 0.6649 426/500 [========================>.....] - ETA: 26s - loss: 3.1512 - regression_loss: 2.4870 - classification_loss: 0.6643 427/500 [========================>.....] - ETA: 25s - loss: 3.1518 - regression_loss: 2.4879 - classification_loss: 0.6639 428/500 [========================>.....] - ETA: 25s - loss: 3.1501 - regression_loss: 2.4869 - classification_loss: 0.6632 429/500 [========================>.....] - ETA: 24s - loss: 3.1490 - regression_loss: 2.4861 - classification_loss: 0.6629 430/500 [========================>.....] - ETA: 24s - loss: 3.1470 - regression_loss: 2.4849 - classification_loss: 0.6621 431/500 [========================>.....] 
- ETA: 24s - loss: 3.1453 - regression_loss: 2.4840 - classification_loss: 0.6613 432/500 [========================>.....] - ETA: 23s - loss: 3.1441 - regression_loss: 2.4833 - classification_loss: 0.6608 433/500 [========================>.....] - ETA: 23s - loss: 3.1432 - regression_loss: 2.4828 - classification_loss: 0.6604 434/500 [=========================>....] - ETA: 23s - loss: 3.1417 - regression_loss: 2.4820 - classification_loss: 0.6597 435/500 [=========================>....] - ETA: 22s - loss: 3.1409 - regression_loss: 2.4818 - classification_loss: 0.6592 436/500 [=========================>....] - ETA: 22s - loss: 3.1393 - regression_loss: 2.4809 - classification_loss: 0.6583 437/500 [=========================>....] - ETA: 22s - loss: 3.1382 - regression_loss: 2.4798 - classification_loss: 0.6584 438/500 [=========================>....] - ETA: 21s - loss: 3.1368 - regression_loss: 2.4790 - classification_loss: 0.6578 439/500 [=========================>....] - ETA: 21s - loss: 3.1352 - regression_loss: 2.4782 - classification_loss: 0.6570 440/500 [=========================>....] - ETA: 21s - loss: 3.1342 - regression_loss: 2.4776 - classification_loss: 0.6566 441/500 [=========================>....] - ETA: 20s - loss: 3.1330 - regression_loss: 2.4769 - classification_loss: 0.6561 442/500 [=========================>....] - ETA: 20s - loss: 3.1317 - regression_loss: 2.4761 - classification_loss: 0.6555 443/500 [=========================>....] - ETA: 19s - loss: 3.1304 - regression_loss: 2.4756 - classification_loss: 0.6548 444/500 [=========================>....] - ETA: 19s - loss: 3.1293 - regression_loss: 2.4751 - classification_loss: 0.6542 445/500 [=========================>....] - ETA: 19s - loss: 3.1287 - regression_loss: 2.4747 - classification_loss: 0.6539 446/500 [=========================>....] - ETA: 18s - loss: 3.1286 - regression_loss: 2.4750 - classification_loss: 0.6535 447/500 [=========================>....] 
- ETA: 18s - loss: 3.1274 - regression_loss: 2.4744 - classification_loss: 0.6530 448/500 [=========================>....] - ETA: 18s - loss: 3.1266 - regression_loss: 2.4740 - classification_loss: 0.6526 449/500 [=========================>....] - ETA: 17s - loss: 3.1254 - regression_loss: 2.4733 - classification_loss: 0.6521 450/500 [==========================>...] - ETA: 17s - loss: 3.1243 - regression_loss: 2.4727 - classification_loss: 0.6516 451/500 [==========================>...] - ETA: 17s - loss: 3.1231 - regression_loss: 2.4720 - classification_loss: 0.6511 452/500 [==========================>...] - ETA: 16s - loss: 3.1218 - regression_loss: 2.4713 - classification_loss: 0.6505 453/500 [==========================>...] - ETA: 16s - loss: 3.1209 - regression_loss: 2.4708 - classification_loss: 0.6501 454/500 [==========================>...] - ETA: 16s - loss: 3.1202 - regression_loss: 2.4701 - classification_loss: 0.6501 455/500 [==========================>...] - ETA: 15s - loss: 3.1191 - regression_loss: 2.4695 - classification_loss: 0.6496 456/500 [==========================>...] - ETA: 15s - loss: 3.1182 - regression_loss: 2.4691 - classification_loss: 0.6491 457/500 [==========================>...] - ETA: 15s - loss: 3.1176 - regression_loss: 2.4690 - classification_loss: 0.6486 458/500 [==========================>...] - ETA: 14s - loss: 3.1165 - regression_loss: 2.4684 - classification_loss: 0.6481 459/500 [==========================>...] - ETA: 14s - loss: 3.1149 - regression_loss: 2.4675 - classification_loss: 0.6475 460/500 [==========================>...] - ETA: 13s - loss: 3.1139 - regression_loss: 2.4669 - classification_loss: 0.6470 461/500 [==========================>...] - ETA: 13s - loss: 3.1123 - regression_loss: 2.4660 - classification_loss: 0.6463 462/500 [==========================>...] - ETA: 13s - loss: 3.1121 - regression_loss: 2.4660 - classification_loss: 0.6460 463/500 [==========================>...] 
- ETA: 12s - loss: 3.1106 - regression_loss: 2.4653 - classification_loss: 0.6453 464/500 [==========================>...] - ETA: 12s - loss: 3.1095 - regression_loss: 2.4648 - classification_loss: 0.6447 465/500 [==========================>...] - ETA: 12s - loss: 3.1085 - regression_loss: 2.4644 - classification_loss: 0.6441 466/500 [==========================>...] - ETA: 11s - loss: 3.1083 - regression_loss: 2.4646 - classification_loss: 0.6437 467/500 [===========================>..] - ETA: 11s - loss: 3.1070 - regression_loss: 2.4639 - classification_loss: 0.6431 468/500 [===========================>..] - ETA: 11s - loss: 3.1062 - regression_loss: 2.4637 - classification_loss: 0.6425 469/500 [===========================>..] - ETA: 10s - loss: 3.1055 - regression_loss: 2.4634 - classification_loss: 0.6421 470/500 [===========================>..] - ETA: 10s - loss: 3.1051 - regression_loss: 2.4633 - classification_loss: 0.6418 471/500 [===========================>..] - ETA: 10s - loss: 3.1042 - regression_loss: 2.4628 - classification_loss: 0.6413 472/500 [===========================>..] - ETA: 9s - loss: 3.1035 - regression_loss: 2.4627 - classification_loss: 0.6408 473/500 [===========================>..] - ETA: 9s - loss: 3.1025 - regression_loss: 2.4623 - classification_loss: 0.6402 474/500 [===========================>..] - ETA: 9s - loss: 3.1007 - regression_loss: 2.4611 - classification_loss: 0.6396 475/500 [===========================>..] - ETA: 8s - loss: 3.0994 - regression_loss: 2.4603 - classification_loss: 0.6391 476/500 [===========================>..] - ETA: 8s - loss: 3.0981 - regression_loss: 2.4597 - classification_loss: 0.6384 477/500 [===========================>..] - ETA: 8s - loss: 3.0968 - regression_loss: 2.4590 - classification_loss: 0.6378 478/500 [===========================>..] - ETA: 7s - loss: 3.0953 - regression_loss: 2.4581 - classification_loss: 0.6372 479/500 [===========================>..] 
[per-batch progress-bar redraws condensed; epoch 1 loss fell from 3.0950 (batch 479) to 3.0743 (batch 499)]
500/500 [==============================] - 174s 348ms/step - loss: 3.0731 - regression_loss: 2.4462 - classification_loss: 0.6269
1172 instances of class plum with average precision: 0.1594
mAP: 0.1594
Epoch 00001: saving model to ./training/snapshots/resnet101_pascal_01.h5
Epoch 2/150
1/500 [..............................] - ETA: 2:45 - loss: 2.6949 - regression_loss: 2.2857 - classification_loss: 0.4092
[per-batch progress-bar redraws condensed; by batch 313/500 loss had fallen to 2.2770 (regression_loss 1.9309, classification_loss 0.3461), log truncated at batch 314]
- ETA: 1:03 - loss: 2.2759 - regression_loss: 1.9300 - classification_loss: 0.3459 315/500 [=================>............] - ETA: 1:03 - loss: 2.2774 - regression_loss: 1.9312 - classification_loss: 0.3462 316/500 [=================>............] - ETA: 1:02 - loss: 2.2763 - regression_loss: 1.9302 - classification_loss: 0.3461 317/500 [==================>...........] - ETA: 1:02 - loss: 2.2763 - regression_loss: 1.9302 - classification_loss: 0.3461 318/500 [==================>...........] - ETA: 1:02 - loss: 2.2751 - regression_loss: 1.9291 - classification_loss: 0.3460 319/500 [==================>...........] - ETA: 1:01 - loss: 2.2743 - regression_loss: 1.9284 - classification_loss: 0.3459 320/500 [==================>...........] - ETA: 1:01 - loss: 2.2737 - regression_loss: 1.9279 - classification_loss: 0.3458 321/500 [==================>...........] - ETA: 1:01 - loss: 2.2734 - regression_loss: 1.9277 - classification_loss: 0.3457 322/500 [==================>...........] - ETA: 1:00 - loss: 2.2731 - regression_loss: 1.9277 - classification_loss: 0.3455 323/500 [==================>...........] - ETA: 1:00 - loss: 2.2727 - regression_loss: 1.9275 - classification_loss: 0.3453 324/500 [==================>...........] - ETA: 1:00 - loss: 2.2728 - regression_loss: 1.9275 - classification_loss: 0.3453 325/500 [==================>...........] - ETA: 59s - loss: 2.2721 - regression_loss: 1.9268 - classification_loss: 0.3452  326/500 [==================>...........] - ETA: 59s - loss: 2.2718 - regression_loss: 1.9265 - classification_loss: 0.3452 327/500 [==================>...........] - ETA: 59s - loss: 2.2727 - regression_loss: 1.9273 - classification_loss: 0.3454 328/500 [==================>...........] - ETA: 58s - loss: 2.2726 - regression_loss: 1.9276 - classification_loss: 0.3450 329/500 [==================>...........] - ETA: 58s - loss: 2.2736 - regression_loss: 1.9285 - classification_loss: 0.3451 330/500 [==================>...........] 
- ETA: 58s - loss: 2.2717 - regression_loss: 1.9269 - classification_loss: 0.3448 331/500 [==================>...........] - ETA: 57s - loss: 2.2707 - regression_loss: 1.9262 - classification_loss: 0.3445 332/500 [==================>...........] - ETA: 57s - loss: 2.2710 - regression_loss: 1.9265 - classification_loss: 0.3445 333/500 [==================>...........] - ETA: 57s - loss: 2.2691 - regression_loss: 1.9250 - classification_loss: 0.3441 334/500 [===================>..........] - ETA: 56s - loss: 2.2690 - regression_loss: 1.9250 - classification_loss: 0.3440 335/500 [===================>..........] - ETA: 56s - loss: 2.2682 - regression_loss: 1.9243 - classification_loss: 0.3439 336/500 [===================>..........] - ETA: 56s - loss: 2.2648 - regression_loss: 1.9212 - classification_loss: 0.3436 337/500 [===================>..........] - ETA: 55s - loss: 2.2652 - regression_loss: 1.9215 - classification_loss: 0.3437 338/500 [===================>..........] - ETA: 55s - loss: 2.2648 - regression_loss: 1.9212 - classification_loss: 0.3436 339/500 [===================>..........] - ETA: 55s - loss: 2.2640 - regression_loss: 1.9204 - classification_loss: 0.3436 340/500 [===================>..........] - ETA: 54s - loss: 2.2638 - regression_loss: 1.9204 - classification_loss: 0.3435 341/500 [===================>..........] - ETA: 54s - loss: 2.2635 - regression_loss: 1.9201 - classification_loss: 0.3434 342/500 [===================>..........] - ETA: 54s - loss: 2.2634 - regression_loss: 1.9199 - classification_loss: 0.3435 343/500 [===================>..........] - ETA: 53s - loss: 2.2624 - regression_loss: 1.9191 - classification_loss: 0.3433 344/500 [===================>..........] - ETA: 53s - loss: 2.2617 - regression_loss: 1.9184 - classification_loss: 0.3434 345/500 [===================>..........] - ETA: 53s - loss: 2.2635 - regression_loss: 1.9200 - classification_loss: 0.3435 346/500 [===================>..........] 
- ETA: 52s - loss: 2.2636 - regression_loss: 1.9201 - classification_loss: 0.3435 347/500 [===================>..........] - ETA: 52s - loss: 2.2596 - regression_loss: 1.9166 - classification_loss: 0.3430 348/500 [===================>..........] - ETA: 52s - loss: 2.2600 - regression_loss: 1.9169 - classification_loss: 0.3430 349/500 [===================>..........] - ETA: 51s - loss: 2.2598 - regression_loss: 1.9167 - classification_loss: 0.3431 350/500 [====================>.........] - ETA: 51s - loss: 2.2596 - regression_loss: 1.9165 - classification_loss: 0.3432 351/500 [====================>.........] - ETA: 51s - loss: 2.2587 - regression_loss: 1.9156 - classification_loss: 0.3430 352/500 [====================>.........] - ETA: 50s - loss: 2.2596 - regression_loss: 1.9163 - classification_loss: 0.3433 353/500 [====================>.........] - ETA: 50s - loss: 2.2594 - regression_loss: 1.9158 - classification_loss: 0.3436 354/500 [====================>.........] - ETA: 50s - loss: 2.2597 - regression_loss: 1.9161 - classification_loss: 0.3436 355/500 [====================>.........] - ETA: 49s - loss: 2.2596 - regression_loss: 1.9160 - classification_loss: 0.3436 356/500 [====================>.........] - ETA: 49s - loss: 2.2597 - regression_loss: 1.9161 - classification_loss: 0.3436 357/500 [====================>.........] - ETA: 49s - loss: 2.2589 - regression_loss: 1.9155 - classification_loss: 0.3434 358/500 [====================>.........] - ETA: 48s - loss: 2.2590 - regression_loss: 1.9155 - classification_loss: 0.3435 359/500 [====================>.........] - ETA: 48s - loss: 2.2584 - regression_loss: 1.9152 - classification_loss: 0.3432 360/500 [====================>.........] - ETA: 47s - loss: 2.2574 - regression_loss: 1.9143 - classification_loss: 0.3431 361/500 [====================>.........] - ETA: 47s - loss: 2.2573 - regression_loss: 1.9143 - classification_loss: 0.3430 362/500 [====================>.........] 
- ETA: 47s - loss: 2.2585 - regression_loss: 1.9154 - classification_loss: 0.3431 363/500 [====================>.........] - ETA: 46s - loss: 2.2599 - regression_loss: 1.9167 - classification_loss: 0.3432 364/500 [====================>.........] - ETA: 46s - loss: 2.2602 - regression_loss: 1.9168 - classification_loss: 0.3434 365/500 [====================>.........] - ETA: 46s - loss: 2.2605 - regression_loss: 1.9171 - classification_loss: 0.3435 366/500 [====================>.........] - ETA: 45s - loss: 2.2599 - regression_loss: 1.9167 - classification_loss: 0.3432 367/500 [=====================>........] - ETA: 45s - loss: 2.2593 - regression_loss: 1.9161 - classification_loss: 0.3432 368/500 [=====================>........] - ETA: 45s - loss: 2.2590 - regression_loss: 1.9158 - classification_loss: 0.3432 369/500 [=====================>........] - ETA: 44s - loss: 2.2579 - regression_loss: 1.9149 - classification_loss: 0.3431 370/500 [=====================>........] - ETA: 44s - loss: 2.2559 - regression_loss: 1.9130 - classification_loss: 0.3429 371/500 [=====================>........] - ETA: 44s - loss: 2.2561 - regression_loss: 1.9131 - classification_loss: 0.3430 372/500 [=====================>........] - ETA: 43s - loss: 2.2565 - regression_loss: 1.9135 - classification_loss: 0.3430 373/500 [=====================>........] - ETA: 43s - loss: 2.2563 - regression_loss: 1.9134 - classification_loss: 0.3429 374/500 [=====================>........] - ETA: 43s - loss: 2.2547 - regression_loss: 1.9119 - classification_loss: 0.3428 375/500 [=====================>........] - ETA: 42s - loss: 2.2547 - regression_loss: 1.9119 - classification_loss: 0.3428 376/500 [=====================>........] - ETA: 42s - loss: 2.2538 - regression_loss: 1.9113 - classification_loss: 0.3426 377/500 [=====================>........] - ETA: 42s - loss: 2.2540 - regression_loss: 1.9115 - classification_loss: 0.3425 378/500 [=====================>........] 
- ETA: 41s - loss: 2.2511 - regression_loss: 1.9088 - classification_loss: 0.3423 379/500 [=====================>........] - ETA: 41s - loss: 2.2510 - regression_loss: 1.9088 - classification_loss: 0.3422 380/500 [=====================>........] - ETA: 41s - loss: 2.2515 - regression_loss: 1.9093 - classification_loss: 0.3422 381/500 [=====================>........] - ETA: 40s - loss: 2.2506 - regression_loss: 1.9085 - classification_loss: 0.3421 382/500 [=====================>........] - ETA: 40s - loss: 2.2503 - regression_loss: 1.9084 - classification_loss: 0.3419 383/500 [=====================>........] - ETA: 40s - loss: 2.2505 - regression_loss: 1.9085 - classification_loss: 0.3419 384/500 [======================>.......] - ETA: 39s - loss: 2.2511 - regression_loss: 1.9088 - classification_loss: 0.3423 385/500 [======================>.......] - ETA: 39s - loss: 2.2509 - regression_loss: 1.9087 - classification_loss: 0.3422 386/500 [======================>.......] - ETA: 39s - loss: 2.2499 - regression_loss: 1.9080 - classification_loss: 0.3420 387/500 [======================>.......] - ETA: 38s - loss: 2.2514 - regression_loss: 1.9090 - classification_loss: 0.3424 388/500 [======================>.......] - ETA: 38s - loss: 2.2498 - regression_loss: 1.9077 - classification_loss: 0.3421 389/500 [======================>.......] - ETA: 38s - loss: 2.2503 - regression_loss: 1.9080 - classification_loss: 0.3423 390/500 [======================>.......] - ETA: 37s - loss: 2.2494 - regression_loss: 1.9072 - classification_loss: 0.3422 391/500 [======================>.......] - ETA: 37s - loss: 2.2497 - regression_loss: 1.9076 - classification_loss: 0.3421 392/500 [======================>.......] - ETA: 37s - loss: 2.2485 - regression_loss: 1.9066 - classification_loss: 0.3419 393/500 [======================>.......] - ETA: 36s - loss: 2.2480 - regression_loss: 1.9061 - classification_loss: 0.3419 394/500 [======================>.......] 
- ETA: 36s - loss: 2.2487 - regression_loss: 1.9066 - classification_loss: 0.3421 395/500 [======================>.......] - ETA: 36s - loss: 2.2489 - regression_loss: 1.9069 - classification_loss: 0.3420 396/500 [======================>.......] - ETA: 35s - loss: 2.2489 - regression_loss: 1.9071 - classification_loss: 0.3418 397/500 [======================>.......] - ETA: 35s - loss: 2.2480 - regression_loss: 1.9064 - classification_loss: 0.3416 398/500 [======================>.......] - ETA: 35s - loss: 2.2498 - regression_loss: 1.9077 - classification_loss: 0.3421 399/500 [======================>.......] - ETA: 34s - loss: 2.2505 - regression_loss: 1.9080 - classification_loss: 0.3425 400/500 [=======================>......] - ETA: 34s - loss: 2.2509 - regression_loss: 1.9084 - classification_loss: 0.3425 401/500 [=======================>......] - ETA: 33s - loss: 2.2505 - regression_loss: 1.9082 - classification_loss: 0.3423 402/500 [=======================>......] - ETA: 33s - loss: 2.2496 - regression_loss: 1.9075 - classification_loss: 0.3422 403/500 [=======================>......] - ETA: 33s - loss: 2.2493 - regression_loss: 1.9071 - classification_loss: 0.3421 404/500 [=======================>......] - ETA: 32s - loss: 2.2503 - regression_loss: 1.9080 - classification_loss: 0.3423 405/500 [=======================>......] - ETA: 32s - loss: 2.2505 - regression_loss: 1.9081 - classification_loss: 0.3425 406/500 [=======================>......] - ETA: 32s - loss: 2.2508 - regression_loss: 1.9082 - classification_loss: 0.3426 407/500 [=======================>......] - ETA: 31s - loss: 2.2507 - regression_loss: 1.9080 - classification_loss: 0.3426 408/500 [=======================>......] - ETA: 31s - loss: 2.2510 - regression_loss: 1.9084 - classification_loss: 0.3426 409/500 [=======================>......] - ETA: 31s - loss: 2.2500 - regression_loss: 1.9074 - classification_loss: 0.3426 410/500 [=======================>......] 
- ETA: 30s - loss: 2.2493 - regression_loss: 1.9068 - classification_loss: 0.3425 411/500 [=======================>......] - ETA: 30s - loss: 2.2482 - regression_loss: 1.9059 - classification_loss: 0.3423 412/500 [=======================>......] - ETA: 30s - loss: 2.2470 - regression_loss: 1.9048 - classification_loss: 0.3422 413/500 [=======================>......] - ETA: 29s - loss: 2.2465 - regression_loss: 1.9044 - classification_loss: 0.3421 414/500 [=======================>......] - ETA: 29s - loss: 2.2448 - regression_loss: 1.9030 - classification_loss: 0.3418 415/500 [=======================>......] - ETA: 29s - loss: 2.2446 - regression_loss: 1.9029 - classification_loss: 0.3418 416/500 [=======================>......] - ETA: 28s - loss: 2.2441 - regression_loss: 1.9025 - classification_loss: 0.3416 417/500 [========================>.....] - ETA: 28s - loss: 2.2439 - regression_loss: 1.9023 - classification_loss: 0.3416 418/500 [========================>.....] - ETA: 28s - loss: 2.2434 - regression_loss: 1.9020 - classification_loss: 0.3414 419/500 [========================>.....] - ETA: 27s - loss: 2.2435 - regression_loss: 1.9021 - classification_loss: 0.3414 420/500 [========================>.....] - ETA: 27s - loss: 2.2437 - regression_loss: 1.9023 - classification_loss: 0.3413 421/500 [========================>.....] - ETA: 27s - loss: 2.2415 - regression_loss: 1.9005 - classification_loss: 0.3409 422/500 [========================>.....] - ETA: 26s - loss: 2.2411 - regression_loss: 1.9003 - classification_loss: 0.3408 423/500 [========================>.....] - ETA: 26s - loss: 2.2412 - regression_loss: 1.9005 - classification_loss: 0.3407 424/500 [========================>.....] - ETA: 26s - loss: 2.2422 - regression_loss: 1.9014 - classification_loss: 0.3408 425/500 [========================>.....] - ETA: 25s - loss: 2.2412 - regression_loss: 1.9006 - classification_loss: 0.3406 426/500 [========================>.....] 
- ETA: 25s - loss: 2.2412 - regression_loss: 1.9007 - classification_loss: 0.3405 427/500 [========================>.....] - ETA: 25s - loss: 2.2403 - regression_loss: 1.9000 - classification_loss: 0.3403 428/500 [========================>.....] - ETA: 24s - loss: 2.2389 - regression_loss: 1.8988 - classification_loss: 0.3400 429/500 [========================>.....] - ETA: 24s - loss: 2.2387 - regression_loss: 1.8988 - classification_loss: 0.3399 430/500 [========================>.....] - ETA: 24s - loss: 2.2388 - regression_loss: 1.8989 - classification_loss: 0.3398 431/500 [========================>.....] - ETA: 23s - loss: 2.2385 - regression_loss: 1.8987 - classification_loss: 0.3397 432/500 [========================>.....] - ETA: 23s - loss: 2.2373 - regression_loss: 1.8977 - classification_loss: 0.3396 433/500 [========================>.....] - ETA: 23s - loss: 2.2366 - regression_loss: 1.8971 - classification_loss: 0.3395 434/500 [=========================>....] - ETA: 22s - loss: 2.2358 - regression_loss: 1.8965 - classification_loss: 0.3394 435/500 [=========================>....] - ETA: 22s - loss: 2.2348 - regression_loss: 1.8955 - classification_loss: 0.3393 436/500 [=========================>....] - ETA: 21s - loss: 2.2325 - regression_loss: 1.8932 - classification_loss: 0.3392 437/500 [=========================>....] - ETA: 21s - loss: 2.2327 - regression_loss: 1.8935 - classification_loss: 0.3392 438/500 [=========================>....] - ETA: 21s - loss: 2.2329 - regression_loss: 1.8937 - classification_loss: 0.3392 439/500 [=========================>....] - ETA: 20s - loss: 2.2335 - regression_loss: 1.8943 - classification_loss: 0.3393 440/500 [=========================>....] - ETA: 20s - loss: 2.2339 - regression_loss: 1.8946 - classification_loss: 0.3393 441/500 [=========================>....] - ETA: 20s - loss: 2.2319 - regression_loss: 1.8928 - classification_loss: 0.3391 442/500 [=========================>....] 
- ETA: 19s - loss: 2.2319 - regression_loss: 1.8928 - classification_loss: 0.3391 443/500 [=========================>....] - ETA: 19s - loss: 2.2315 - regression_loss: 1.8925 - classification_loss: 0.3390 444/500 [=========================>....] - ETA: 19s - loss: 2.2313 - regression_loss: 1.8924 - classification_loss: 0.3389 445/500 [=========================>....] - ETA: 18s - loss: 2.2312 - regression_loss: 1.8923 - classification_loss: 0.3389 446/500 [=========================>....] - ETA: 18s - loss: 2.2308 - regression_loss: 1.8920 - classification_loss: 0.3388 447/500 [=========================>....] - ETA: 18s - loss: 2.2301 - regression_loss: 1.8914 - classification_loss: 0.3387 448/500 [=========================>....] - ETA: 17s - loss: 2.2287 - regression_loss: 1.8903 - classification_loss: 0.3384 449/500 [=========================>....] - ETA: 17s - loss: 2.2292 - regression_loss: 1.8908 - classification_loss: 0.3384 450/500 [==========================>...] - ETA: 17s - loss: 2.2284 - regression_loss: 1.8902 - classification_loss: 0.3383 451/500 [==========================>...] - ETA: 16s - loss: 2.2276 - regression_loss: 1.8894 - classification_loss: 0.3381 452/500 [==========================>...] - ETA: 16s - loss: 2.2277 - regression_loss: 1.8894 - classification_loss: 0.3383 453/500 [==========================>...] - ETA: 16s - loss: 2.2286 - regression_loss: 1.8902 - classification_loss: 0.3385 454/500 [==========================>...] - ETA: 15s - loss: 2.2284 - regression_loss: 1.8901 - classification_loss: 0.3384 455/500 [==========================>...] - ETA: 15s - loss: 2.2273 - regression_loss: 1.8889 - classification_loss: 0.3384 456/500 [==========================>...] - ETA: 15s - loss: 2.2270 - regression_loss: 1.8887 - classification_loss: 0.3383 457/500 [==========================>...] - ETA: 14s - loss: 2.2275 - regression_loss: 1.8893 - classification_loss: 0.3382 458/500 [==========================>...] 
- ETA: 14s - loss: 2.2287 - regression_loss: 1.8905 - classification_loss: 0.3382 459/500 [==========================>...] - ETA: 14s - loss: 2.2276 - regression_loss: 1.8897 - classification_loss: 0.3379 460/500 [==========================>...] - ETA: 13s - loss: 2.2266 - regression_loss: 1.8888 - classification_loss: 0.3377 461/500 [==========================>...] - ETA: 13s - loss: 2.2275 - regression_loss: 1.8897 - classification_loss: 0.3378 462/500 [==========================>...] - ETA: 13s - loss: 2.2265 - regression_loss: 1.8888 - classification_loss: 0.3377 463/500 [==========================>...] - ETA: 12s - loss: 2.2265 - regression_loss: 1.8888 - classification_loss: 0.3377 464/500 [==========================>...] - ETA: 12s - loss: 2.2241 - regression_loss: 1.8868 - classification_loss: 0.3373 465/500 [==========================>...] - ETA: 12s - loss: 2.2238 - regression_loss: 1.8864 - classification_loss: 0.3373 466/500 [==========================>...] - ETA: 11s - loss: 2.2246 - regression_loss: 1.8872 - classification_loss: 0.3374 467/500 [===========================>..] - ETA: 11s - loss: 2.2232 - regression_loss: 1.8859 - classification_loss: 0.3372 468/500 [===========================>..] - ETA: 11s - loss: 2.2218 - regression_loss: 1.8848 - classification_loss: 0.3370 469/500 [===========================>..] - ETA: 10s - loss: 2.2217 - regression_loss: 1.8848 - classification_loss: 0.3369 470/500 [===========================>..] - ETA: 10s - loss: 2.2219 - regression_loss: 1.8850 - classification_loss: 0.3370 471/500 [===========================>..] - ETA: 9s - loss: 2.2229 - regression_loss: 1.8858 - classification_loss: 0.3371  472/500 [===========================>..] - ETA: 9s - loss: 2.2226 - regression_loss: 1.8854 - classification_loss: 0.3371 473/500 [===========================>..] - ETA: 9s - loss: 2.2228 - regression_loss: 1.8858 - classification_loss: 0.3370 474/500 [===========================>..] 
- ETA: 8s - loss: 2.2231 - regression_loss: 1.8862 - classification_loss: 0.3370 475/500 [===========================>..] - ETA: 8s - loss: 2.2216 - regression_loss: 1.8848 - classification_loss: 0.3367 476/500 [===========================>..] - ETA: 8s - loss: 2.2209 - regression_loss: 1.8843 - classification_loss: 0.3366 477/500 [===========================>..] - ETA: 7s - loss: 2.2194 - regression_loss: 1.8830 - classification_loss: 0.3364 478/500 [===========================>..] - ETA: 7s - loss: 2.2197 - regression_loss: 1.8833 - classification_loss: 0.3364 479/500 [===========================>..] - ETA: 7s - loss: 2.2181 - regression_loss: 1.8819 - classification_loss: 0.3362 480/500 [===========================>..] - ETA: 6s - loss: 2.2154 - regression_loss: 1.8797 - classification_loss: 0.3358 481/500 [===========================>..] - ETA: 6s - loss: 2.2144 - regression_loss: 1.8788 - classification_loss: 0.3356 482/500 [===========================>..] - ETA: 6s - loss: 2.2140 - regression_loss: 1.8785 - classification_loss: 0.3356 483/500 [===========================>..] - ETA: 5s - loss: 2.2138 - regression_loss: 1.8782 - classification_loss: 0.3357 484/500 [============================>.] - ETA: 5s - loss: 2.2130 - regression_loss: 1.8775 - classification_loss: 0.3356 485/500 [============================>.] - ETA: 5s - loss: 2.2128 - regression_loss: 1.8773 - classification_loss: 0.3355 486/500 [============================>.] - ETA: 4s - loss: 2.2126 - regression_loss: 1.8771 - classification_loss: 0.3355 487/500 [============================>.] - ETA: 4s - loss: 2.2125 - regression_loss: 1.8769 - classification_loss: 0.3356 488/500 [============================>.] - ETA: 4s - loss: 2.2124 - regression_loss: 1.8770 - classification_loss: 0.3354 489/500 [============================>.] - ETA: 3s - loss: 2.2116 - regression_loss: 1.8762 - classification_loss: 0.3354 490/500 [============================>.] 
500/500 [==============================] - 172s 344ms/step - loss: 2.2031 - regression_loss: 1.8692 - classification_loss: 0.3339
1172 instances of class plum with average precision: 0.4826
mAP: 0.4826
Epoch 00002: saving model to ./training/snapshots/resnet101_pascal_02.h5
Epoch 3/150
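The total `loss` Keras reports in these progress lines is, by default, the unweighted sum of the per-output losses (in keras-retinanet's default configuration, smooth-L1 box regression plus focal classification). A quick sanity check of that decomposition, using the final epoch-2 numbers from the log above:

```python
# Keras prints "loss" as the sum of the per-output losses, weighted 1:1
# unless loss_weights is passed to model.compile().
regression_loss = 1.8692       # final epoch-2 value from the log
classification_loss = 0.3339   # final epoch-2 value from the log

total = regression_loss + classification_loss
assert abs(total - 2.2031) < 1e-6  # matches the logged total loss
```

If the components stopped summing to the total, that would indicate a non-default `loss_weights` setting in `model.compile()`.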
[... per-batch progress output for steps 1-85 of epoch 3 omitted; loss near 2.02 at step 85/500 ...]
- ETA: 2:23 - loss: 2.0246 - regression_loss: 1.7206 - classification_loss: 0.3040 86/500 [====>.........................] - ETA: 2:22 - loss: 2.0288 - regression_loss: 1.7244 - classification_loss: 0.3045 87/500 [====>.........................] - ETA: 2:22 - loss: 2.0290 - regression_loss: 1.7246 - classification_loss: 0.3044 88/500 [====>.........................] - ETA: 2:22 - loss: 2.0295 - regression_loss: 1.7253 - classification_loss: 0.3042 89/500 [====>.........................] - ETA: 2:21 - loss: 2.0236 - regression_loss: 1.7193 - classification_loss: 0.3044 90/500 [====>.........................] - ETA: 2:21 - loss: 2.0239 - regression_loss: 1.7201 - classification_loss: 0.3038 91/500 [====>.........................] - ETA: 2:20 - loss: 2.0176 - regression_loss: 1.7144 - classification_loss: 0.3032 92/500 [====>.........................] - ETA: 2:20 - loss: 2.0196 - regression_loss: 1.7163 - classification_loss: 0.3032 93/500 [====>.........................] - ETA: 2:20 - loss: 2.0198 - regression_loss: 1.7168 - classification_loss: 0.3030 94/500 [====>.........................] - ETA: 2:19 - loss: 2.0158 - regression_loss: 1.7133 - classification_loss: 0.3025 95/500 [====>.........................] - ETA: 2:19 - loss: 2.0158 - regression_loss: 1.7135 - classification_loss: 0.3024 96/500 [====>.........................] - ETA: 2:19 - loss: 2.0074 - regression_loss: 1.7063 - classification_loss: 0.3011 97/500 [====>.........................] - ETA: 2:18 - loss: 2.0092 - regression_loss: 1.7077 - classification_loss: 0.3015 98/500 [====>.........................] - ETA: 2:18 - loss: 2.0034 - regression_loss: 1.7032 - classification_loss: 0.3001 99/500 [====>.........................] - ETA: 2:18 - loss: 2.0020 - regression_loss: 1.7023 - classification_loss: 0.2996 100/500 [=====>........................] - ETA: 2:17 - loss: 1.9935 - regression_loss: 1.6952 - classification_loss: 0.2984 101/500 [=====>........................] 
- ETA: 2:17 - loss: 1.9928 - regression_loss: 1.6946 - classification_loss: 0.2981 102/500 [=====>........................] - ETA: 2:17 - loss: 1.9915 - regression_loss: 1.6932 - classification_loss: 0.2983 103/500 [=====>........................] - ETA: 2:16 - loss: 1.9938 - regression_loss: 1.6954 - classification_loss: 0.2984 104/500 [=====>........................] - ETA: 2:16 - loss: 1.9957 - regression_loss: 1.6973 - classification_loss: 0.2985 105/500 [=====>........................] - ETA: 2:16 - loss: 1.9965 - regression_loss: 1.6975 - classification_loss: 0.2990 106/500 [=====>........................] - ETA: 2:15 - loss: 1.9977 - regression_loss: 1.6982 - classification_loss: 0.2995 107/500 [=====>........................] - ETA: 2:15 - loss: 1.9968 - regression_loss: 1.6973 - classification_loss: 0.2995 108/500 [=====>........................] - ETA: 2:15 - loss: 1.9988 - regression_loss: 1.6992 - classification_loss: 0.2996 109/500 [=====>........................] - ETA: 2:14 - loss: 1.9983 - regression_loss: 1.6990 - classification_loss: 0.2994 110/500 [=====>........................] - ETA: 2:14 - loss: 1.9924 - regression_loss: 1.6941 - classification_loss: 0.2983 111/500 [=====>........................] - ETA: 2:13 - loss: 1.9933 - regression_loss: 1.6949 - classification_loss: 0.2985 112/500 [=====>........................] - ETA: 2:13 - loss: 1.9858 - regression_loss: 1.6884 - classification_loss: 0.2974 113/500 [=====>........................] - ETA: 2:13 - loss: 1.9877 - regression_loss: 1.6900 - classification_loss: 0.2977 114/500 [=====>........................] - ETA: 2:12 - loss: 1.9820 - regression_loss: 1.6853 - classification_loss: 0.2966 115/500 [=====>........................] - ETA: 2:12 - loss: 1.9833 - regression_loss: 1.6868 - classification_loss: 0.2966 116/500 [=====>........................] - ETA: 2:12 - loss: 1.9840 - regression_loss: 1.6874 - classification_loss: 0.2966 117/500 [======>.......................] 
- ETA: 2:11 - loss: 1.9796 - regression_loss: 1.6838 - classification_loss: 0.2958 118/500 [======>.......................] - ETA: 2:11 - loss: 1.9763 - regression_loss: 1.6811 - classification_loss: 0.2952 119/500 [======>.......................] - ETA: 2:11 - loss: 1.9807 - regression_loss: 1.6851 - classification_loss: 0.2957 120/500 [======>.......................] - ETA: 2:10 - loss: 1.9825 - regression_loss: 1.6869 - classification_loss: 0.2956 121/500 [======>.......................] - ETA: 2:10 - loss: 1.9795 - regression_loss: 1.6845 - classification_loss: 0.2949 122/500 [======>.......................] - ETA: 2:10 - loss: 1.9796 - regression_loss: 1.6849 - classification_loss: 0.2947 123/500 [======>.......................] - ETA: 2:09 - loss: 1.9806 - regression_loss: 1.6860 - classification_loss: 0.2945 124/500 [======>.......................] - ETA: 2:09 - loss: 1.9825 - regression_loss: 1.6880 - classification_loss: 0.2945 125/500 [======>.......................] - ETA: 2:09 - loss: 1.9813 - regression_loss: 1.6870 - classification_loss: 0.2943 126/500 [======>.......................] - ETA: 2:08 - loss: 1.9769 - regression_loss: 1.6810 - classification_loss: 0.2960 127/500 [======>.......................] - ETA: 2:08 - loss: 1.9726 - regression_loss: 1.6770 - classification_loss: 0.2956 128/500 [======>.......................] - ETA: 2:08 - loss: 1.9696 - regression_loss: 1.6743 - classification_loss: 0.2953 129/500 [======>.......................] - ETA: 2:07 - loss: 1.9706 - regression_loss: 1.6751 - classification_loss: 0.2955 130/500 [======>.......................] - ETA: 2:07 - loss: 1.9758 - regression_loss: 1.6798 - classification_loss: 0.2960 131/500 [======>.......................] - ETA: 2:07 - loss: 1.9743 - regression_loss: 1.6786 - classification_loss: 0.2957 132/500 [======>.......................] - ETA: 2:06 - loss: 1.9701 - regression_loss: 1.6748 - classification_loss: 0.2953 133/500 [======>.......................] 
- ETA: 2:06 - loss: 1.9739 - regression_loss: 1.6783 - classification_loss: 0.2956 134/500 [=======>......................] - ETA: 2:06 - loss: 1.9763 - regression_loss: 1.6808 - classification_loss: 0.2955 135/500 [=======>......................] - ETA: 2:05 - loss: 1.9744 - regression_loss: 1.6795 - classification_loss: 0.2949 136/500 [=======>......................] - ETA: 2:05 - loss: 1.9722 - regression_loss: 1.6778 - classification_loss: 0.2944 137/500 [=======>......................] - ETA: 2:04 - loss: 1.9723 - regression_loss: 1.6782 - classification_loss: 0.2941 138/500 [=======>......................] - ETA: 2:04 - loss: 1.9655 - regression_loss: 1.6725 - classification_loss: 0.2930 139/500 [=======>......................] - ETA: 2:04 - loss: 1.9604 - regression_loss: 1.6681 - classification_loss: 0.2923 140/500 [=======>......................] - ETA: 2:03 - loss: 1.9615 - regression_loss: 1.6694 - classification_loss: 0.2921 141/500 [=======>......................] - ETA: 2:03 - loss: 1.9623 - regression_loss: 1.6702 - classification_loss: 0.2921 142/500 [=======>......................] - ETA: 2:03 - loss: 1.9624 - regression_loss: 1.6705 - classification_loss: 0.2919 143/500 [=======>......................] - ETA: 2:02 - loss: 1.9628 - regression_loss: 1.6708 - classification_loss: 0.2920 144/500 [=======>......................] - ETA: 2:02 - loss: 1.9631 - regression_loss: 1.6709 - classification_loss: 0.2922 145/500 [=======>......................] - ETA: 2:02 - loss: 1.9660 - regression_loss: 1.6736 - classification_loss: 0.2925 146/500 [=======>......................] - ETA: 2:01 - loss: 1.9728 - regression_loss: 1.6793 - classification_loss: 0.2935 147/500 [=======>......................] - ETA: 2:01 - loss: 1.9757 - regression_loss: 1.6822 - classification_loss: 0.2935 148/500 [=======>......................] - ETA: 2:01 - loss: 1.9769 - regression_loss: 1.6833 - classification_loss: 0.2935 149/500 [=======>......................] 
- ETA: 2:01 - loss: 1.9744 - regression_loss: 1.6813 - classification_loss: 0.2931 150/500 [========>.....................] - ETA: 2:00 - loss: 1.9760 - regression_loss: 1.6826 - classification_loss: 0.2934 151/500 [========>.....................] - ETA: 2:00 - loss: 1.9723 - regression_loss: 1.6792 - classification_loss: 0.2931 152/500 [========>.....................] - ETA: 2:00 - loss: 1.9732 - regression_loss: 1.6806 - classification_loss: 0.2927 153/500 [========>.....................] - ETA: 1:59 - loss: 1.9742 - regression_loss: 1.6812 - classification_loss: 0.2931 154/500 [========>.....................] - ETA: 1:59 - loss: 1.9742 - regression_loss: 1.6812 - classification_loss: 0.2930 155/500 [========>.....................] - ETA: 1:58 - loss: 1.9763 - regression_loss: 1.6831 - classification_loss: 0.2932 156/500 [========>.....................] - ETA: 1:58 - loss: 1.9752 - regression_loss: 1.6827 - classification_loss: 0.2925 157/500 [========>.....................] - ETA: 1:58 - loss: 1.9750 - regression_loss: 1.6823 - classification_loss: 0.2927 158/500 [========>.....................] - ETA: 1:57 - loss: 1.9749 - regression_loss: 1.6824 - classification_loss: 0.2925 159/500 [========>.....................] - ETA: 1:57 - loss: 1.9754 - regression_loss: 1.6828 - classification_loss: 0.2925 160/500 [========>.....................] - ETA: 1:57 - loss: 1.9753 - regression_loss: 1.6829 - classification_loss: 0.2924 161/500 [========>.....................] - ETA: 1:56 - loss: 1.9707 - regression_loss: 1.6790 - classification_loss: 0.2917 162/500 [========>.....................] - ETA: 1:56 - loss: 1.9736 - regression_loss: 1.6814 - classification_loss: 0.2922 163/500 [========>.....................] - ETA: 1:56 - loss: 1.9741 - regression_loss: 1.6814 - classification_loss: 0.2927 164/500 [========>.....................] - ETA: 1:55 - loss: 1.9712 - regression_loss: 1.6792 - classification_loss: 0.2920 165/500 [========>.....................] 
- ETA: 1:55 - loss: 1.9724 - regression_loss: 1.6803 - classification_loss: 0.2920 166/500 [========>.....................] - ETA: 1:55 - loss: 1.9716 - regression_loss: 1.6798 - classification_loss: 0.2918 167/500 [=========>....................] - ETA: 1:54 - loss: 1.9701 - regression_loss: 1.6786 - classification_loss: 0.2915 168/500 [=========>....................] - ETA: 1:54 - loss: 1.9685 - regression_loss: 1.6770 - classification_loss: 0.2915 169/500 [=========>....................] - ETA: 1:54 - loss: 1.9683 - regression_loss: 1.6767 - classification_loss: 0.2915 170/500 [=========>....................] - ETA: 1:53 - loss: 1.9674 - regression_loss: 1.6760 - classification_loss: 0.2914 171/500 [=========>....................] - ETA: 1:53 - loss: 1.9667 - regression_loss: 1.6755 - classification_loss: 0.2913 172/500 [=========>....................] - ETA: 1:53 - loss: 1.9655 - regression_loss: 1.6744 - classification_loss: 0.2911 173/500 [=========>....................] - ETA: 1:52 - loss: 1.9622 - regression_loss: 1.6717 - classification_loss: 0.2906 174/500 [=========>....................] - ETA: 1:52 - loss: 1.9630 - regression_loss: 1.6723 - classification_loss: 0.2906 175/500 [=========>....................] - ETA: 1:52 - loss: 1.9655 - regression_loss: 1.6742 - classification_loss: 0.2913 176/500 [=========>....................] - ETA: 1:51 - loss: 1.9658 - regression_loss: 1.6746 - classification_loss: 0.2913 177/500 [=========>....................] - ETA: 1:51 - loss: 1.9652 - regression_loss: 1.6740 - classification_loss: 0.2912 178/500 [=========>....................] - ETA: 1:51 - loss: 1.9648 - regression_loss: 1.6731 - classification_loss: 0.2917 179/500 [=========>....................] - ETA: 1:50 - loss: 1.9652 - regression_loss: 1.6736 - classification_loss: 0.2915 180/500 [=========>....................] - ETA: 1:50 - loss: 1.9638 - regression_loss: 1.6726 - classification_loss: 0.2912 181/500 [=========>....................] 
- ETA: 1:50 - loss: 1.9620 - regression_loss: 1.6715 - classification_loss: 0.2905 182/500 [=========>....................] - ETA: 1:49 - loss: 1.9639 - regression_loss: 1.6731 - classification_loss: 0.2908 183/500 [=========>....................] - ETA: 1:49 - loss: 1.9647 - regression_loss: 1.6736 - classification_loss: 0.2912 184/500 [==========>...................] - ETA: 1:49 - loss: 1.9654 - regression_loss: 1.6740 - classification_loss: 0.2914 185/500 [==========>...................] - ETA: 1:48 - loss: 1.9679 - regression_loss: 1.6761 - classification_loss: 0.2918 186/500 [==========>...................] - ETA: 1:48 - loss: 1.9644 - regression_loss: 1.6731 - classification_loss: 0.2913 187/500 [==========>...................] - ETA: 1:48 - loss: 1.9652 - regression_loss: 1.6738 - classification_loss: 0.2914 188/500 [==========>...................] - ETA: 1:47 - loss: 1.9653 - regression_loss: 1.6735 - classification_loss: 0.2918 189/500 [==========>...................] - ETA: 1:47 - loss: 1.9669 - regression_loss: 1.6746 - classification_loss: 0.2923 190/500 [==========>...................] - ETA: 1:47 - loss: 1.9656 - regression_loss: 1.6733 - classification_loss: 0.2922 191/500 [==========>...................] - ETA: 1:46 - loss: 1.9619 - regression_loss: 1.6703 - classification_loss: 0.2916 192/500 [==========>...................] - ETA: 1:46 - loss: 1.9632 - regression_loss: 1.6717 - classification_loss: 0.2914 193/500 [==========>...................] - ETA: 1:46 - loss: 1.9607 - regression_loss: 1.6694 - classification_loss: 0.2913 194/500 [==========>...................] - ETA: 1:45 - loss: 1.9604 - regression_loss: 1.6690 - classification_loss: 0.2914 195/500 [==========>...................] - ETA: 1:45 - loss: 1.9597 - regression_loss: 1.6685 - classification_loss: 0.2912 196/500 [==========>...................] - ETA: 1:45 - loss: 1.9610 - regression_loss: 1.6696 - classification_loss: 0.2914 197/500 [==========>...................] 
- ETA: 1:44 - loss: 1.9624 - regression_loss: 1.6709 - classification_loss: 0.2915 198/500 [==========>...................] - ETA: 1:44 - loss: 1.9622 - regression_loss: 1.6708 - classification_loss: 0.2914 199/500 [==========>...................] - ETA: 1:44 - loss: 1.9560 - regression_loss: 1.6655 - classification_loss: 0.2906 200/500 [===========>..................] - ETA: 1:43 - loss: 1.9533 - regression_loss: 1.6628 - classification_loss: 0.2905 201/500 [===========>..................] - ETA: 1:43 - loss: 1.9510 - regression_loss: 1.6606 - classification_loss: 0.2904 202/500 [===========>..................] - ETA: 1:43 - loss: 1.9520 - regression_loss: 1.6615 - classification_loss: 0.2905 203/500 [===========>..................] - ETA: 1:42 - loss: 1.9517 - regression_loss: 1.6610 - classification_loss: 0.2907 204/500 [===========>..................] - ETA: 1:42 - loss: 1.9524 - regression_loss: 1.6619 - classification_loss: 0.2904 205/500 [===========>..................] - ETA: 1:42 - loss: 1.9537 - regression_loss: 1.6626 - classification_loss: 0.2912 206/500 [===========>..................] - ETA: 1:41 - loss: 1.9541 - regression_loss: 1.6629 - classification_loss: 0.2912 207/500 [===========>..................] - ETA: 1:41 - loss: 1.9540 - regression_loss: 1.6630 - classification_loss: 0.2910 208/500 [===========>..................] - ETA: 1:40 - loss: 1.9547 - regression_loss: 1.6637 - classification_loss: 0.2910 209/500 [===========>..................] - ETA: 1:40 - loss: 1.9533 - regression_loss: 1.6626 - classification_loss: 0.2907 210/500 [===========>..................] - ETA: 1:40 - loss: 1.9523 - regression_loss: 1.6619 - classification_loss: 0.2905 211/500 [===========>..................] - ETA: 1:39 - loss: 1.9519 - regression_loss: 1.6620 - classification_loss: 0.2899 212/500 [===========>..................] - ETA: 1:39 - loss: 1.9506 - regression_loss: 1.6611 - classification_loss: 0.2895 213/500 [===========>..................] 
- ETA: 1:39 - loss: 1.9490 - regression_loss: 1.6599 - classification_loss: 0.2891 214/500 [===========>..................] - ETA: 1:38 - loss: 1.9496 - regression_loss: 1.6606 - classification_loss: 0.2890 215/500 [===========>..................] - ETA: 1:38 - loss: 1.9498 - regression_loss: 1.6611 - classification_loss: 0.2887 216/500 [===========>..................] - ETA: 1:38 - loss: 1.9507 - regression_loss: 1.6620 - classification_loss: 0.2887 217/500 [============>.................] - ETA: 1:37 - loss: 1.9501 - regression_loss: 1.6617 - classification_loss: 0.2884 218/500 [============>.................] - ETA: 1:37 - loss: 1.9510 - regression_loss: 1.6626 - classification_loss: 0.2885 219/500 [============>.................] - ETA: 1:37 - loss: 1.9537 - regression_loss: 1.6648 - classification_loss: 0.2889 220/500 [============>.................] - ETA: 1:36 - loss: 1.9523 - regression_loss: 1.6637 - classification_loss: 0.2886 221/500 [============>.................] - ETA: 1:36 - loss: 1.9528 - regression_loss: 1.6642 - classification_loss: 0.2887 222/500 [============>.................] - ETA: 1:36 - loss: 1.9536 - regression_loss: 1.6648 - classification_loss: 0.2888 223/500 [============>.................] - ETA: 1:35 - loss: 1.9561 - regression_loss: 1.6670 - classification_loss: 0.2891 224/500 [============>.................] - ETA: 1:35 - loss: 1.9577 - regression_loss: 1.6685 - classification_loss: 0.2893 225/500 [============>.................] - ETA: 1:35 - loss: 1.9580 - regression_loss: 1.6689 - classification_loss: 0.2892 226/500 [============>.................] - ETA: 1:34 - loss: 1.9589 - regression_loss: 1.6700 - classification_loss: 0.2889 227/500 [============>.................] - ETA: 1:34 - loss: 1.9594 - regression_loss: 1.6708 - classification_loss: 0.2886 228/500 [============>.................] - ETA: 1:34 - loss: 1.9583 - regression_loss: 1.6703 - classification_loss: 0.2880 229/500 [============>.................] 
- ETA: 1:33 - loss: 1.9596 - regression_loss: 1.6714 - classification_loss: 0.2882 230/500 [============>.................] - ETA: 1:33 - loss: 1.9611 - regression_loss: 1.6727 - classification_loss: 0.2884 231/500 [============>.................] - ETA: 1:33 - loss: 1.9634 - regression_loss: 1.6748 - classification_loss: 0.2886 232/500 [============>.................] - ETA: 1:32 - loss: 1.9622 - regression_loss: 1.6738 - classification_loss: 0.2884 233/500 [============>.................] - ETA: 1:32 - loss: 1.9610 - regression_loss: 1.6727 - classification_loss: 0.2882 234/500 [=============>................] - ETA: 1:31 - loss: 1.9596 - regression_loss: 1.6717 - classification_loss: 0.2879 235/500 [=============>................] - ETA: 1:31 - loss: 1.9606 - regression_loss: 1.6726 - classification_loss: 0.2880 236/500 [=============>................] - ETA: 1:31 - loss: 1.9602 - regression_loss: 1.6722 - classification_loss: 0.2879 237/500 [=============>................] - ETA: 1:30 - loss: 1.9616 - regression_loss: 1.6736 - classification_loss: 0.2880 238/500 [=============>................] - ETA: 1:30 - loss: 1.9598 - regression_loss: 1.6722 - classification_loss: 0.2875 239/500 [=============>................] - ETA: 1:30 - loss: 1.9609 - regression_loss: 1.6735 - classification_loss: 0.2875 240/500 [=============>................] - ETA: 1:29 - loss: 1.9595 - regression_loss: 1.6723 - classification_loss: 0.2872 241/500 [=============>................] - ETA: 1:29 - loss: 1.9594 - regression_loss: 1.6723 - classification_loss: 0.2872 242/500 [=============>................] - ETA: 1:29 - loss: 1.9598 - regression_loss: 1.6727 - classification_loss: 0.2871 243/500 [=============>................] - ETA: 1:28 - loss: 1.9599 - regression_loss: 1.6727 - classification_loss: 0.2871 244/500 [=============>................] - ETA: 1:28 - loss: 1.9603 - regression_loss: 1.6731 - classification_loss: 0.2872 245/500 [=============>................] 
- ETA: 1:28 - loss: 1.9595 - regression_loss: 1.6724 - classification_loss: 0.2871 246/500 [=============>................] - ETA: 1:27 - loss: 1.9597 - regression_loss: 1.6726 - classification_loss: 0.2871 247/500 [=============>................] - ETA: 1:27 - loss: 1.9598 - regression_loss: 1.6728 - classification_loss: 0.2870 248/500 [=============>................] - ETA: 1:27 - loss: 1.9602 - regression_loss: 1.6731 - classification_loss: 0.2871 249/500 [=============>................] - ETA: 1:26 - loss: 1.9582 - regression_loss: 1.6714 - classification_loss: 0.2868 250/500 [==============>...............] - ETA: 1:26 - loss: 1.9569 - regression_loss: 1.6704 - classification_loss: 0.2865 251/500 [==============>...............] - ETA: 1:26 - loss: 1.9574 - regression_loss: 1.6711 - classification_loss: 0.2863 252/500 [==============>...............] - ETA: 1:25 - loss: 1.9572 - regression_loss: 1.6709 - classification_loss: 0.2863 253/500 [==============>...............] - ETA: 1:25 - loss: 1.9544 - regression_loss: 1.6686 - classification_loss: 0.2858 254/500 [==============>...............] - ETA: 1:25 - loss: 1.9538 - regression_loss: 1.6680 - classification_loss: 0.2858 255/500 [==============>...............] - ETA: 1:24 - loss: 1.9518 - regression_loss: 1.6662 - classification_loss: 0.2857 256/500 [==============>...............] - ETA: 1:24 - loss: 1.9507 - regression_loss: 1.6645 - classification_loss: 0.2862 257/500 [==============>...............] - ETA: 1:23 - loss: 1.9483 - regression_loss: 1.6624 - classification_loss: 0.2858 258/500 [==============>...............] - ETA: 1:23 - loss: 1.9467 - regression_loss: 1.6612 - classification_loss: 0.2855 259/500 [==============>...............] - ETA: 1:23 - loss: 1.9471 - regression_loss: 1.6615 - classification_loss: 0.2856 260/500 [==============>...............] - ETA: 1:22 - loss: 1.9459 - regression_loss: 1.6605 - classification_loss: 0.2854 261/500 [==============>...............] 
- ETA: 1:22 - loss: 1.9477 - regression_loss: 1.6620 - classification_loss: 0.2857 262/500 [==============>...............] - ETA: 1:22 - loss: 1.9475 - regression_loss: 1.6618 - classification_loss: 0.2857 263/500 [==============>...............] - ETA: 1:21 - loss: 1.9483 - regression_loss: 1.6623 - classification_loss: 0.2860 264/500 [==============>...............] - ETA: 1:21 - loss: 1.9475 - regression_loss: 1.6617 - classification_loss: 0.2858 265/500 [==============>...............] - ETA: 1:21 - loss: 1.9466 - regression_loss: 1.6607 - classification_loss: 0.2859 266/500 [==============>...............] - ETA: 1:20 - loss: 1.9488 - regression_loss: 1.6622 - classification_loss: 0.2866 267/500 [===============>..............] - ETA: 1:20 - loss: 1.9463 - regression_loss: 1.6601 - classification_loss: 0.2862 268/500 [===============>..............] - ETA: 1:20 - loss: 1.9468 - regression_loss: 1.6605 - classification_loss: 0.2863 269/500 [===============>..............] - ETA: 1:19 - loss: 1.9466 - regression_loss: 1.6603 - classification_loss: 0.2863 270/500 [===============>..............] - ETA: 1:19 - loss: 1.9483 - regression_loss: 1.6618 - classification_loss: 0.2865 271/500 [===============>..............] - ETA: 1:19 - loss: 1.9477 - regression_loss: 1.6614 - classification_loss: 0.2864 272/500 [===============>..............] - ETA: 1:18 - loss: 1.9460 - regression_loss: 1.6597 - classification_loss: 0.2863 273/500 [===============>..............] - ETA: 1:18 - loss: 1.9473 - regression_loss: 1.6609 - classification_loss: 0.2865 274/500 [===============>..............] - ETA: 1:18 - loss: 1.9475 - regression_loss: 1.6611 - classification_loss: 0.2864 275/500 [===============>..............] - ETA: 1:17 - loss: 1.9471 - regression_loss: 1.6610 - classification_loss: 0.2862 276/500 [===============>..............] - ETA: 1:17 - loss: 1.9473 - regression_loss: 1.6611 - classification_loss: 0.2862 277/500 [===============>..............] 
- ETA: 1:17 - loss: 1.9484 - regression_loss: 1.6621 - classification_loss: 0.2863 278/500 [===============>..............] - ETA: 1:16 - loss: 1.9487 - regression_loss: 1.6623 - classification_loss: 0.2864 279/500 [===============>..............] - ETA: 1:16 - loss: 1.9447 - regression_loss: 1.6588 - classification_loss: 0.2858 280/500 [===============>..............] - ETA: 1:16 - loss: 1.9450 - regression_loss: 1.6592 - classification_loss: 0.2857 281/500 [===============>..............] - ETA: 1:15 - loss: 1.9423 - regression_loss: 1.6567 - classification_loss: 0.2856 282/500 [===============>..............] - ETA: 1:15 - loss: 1.9417 - regression_loss: 1.6563 - classification_loss: 0.2854 283/500 [===============>..............] - ETA: 1:15 - loss: 1.9393 - regression_loss: 1.6544 - classification_loss: 0.2849 284/500 [================>.............] - ETA: 1:14 - loss: 1.9394 - regression_loss: 1.6545 - classification_loss: 0.2849 285/500 [================>.............] - ETA: 1:14 - loss: 1.9393 - regression_loss: 1.6545 - classification_loss: 0.2849 286/500 [================>.............] - ETA: 1:14 - loss: 1.9365 - regression_loss: 1.6521 - classification_loss: 0.2845 287/500 [================>.............] - ETA: 1:13 - loss: 1.9368 - regression_loss: 1.6523 - classification_loss: 0.2845 288/500 [================>.............] - ETA: 1:13 - loss: 1.9363 - regression_loss: 1.6520 - classification_loss: 0.2843 289/500 [================>.............] - ETA: 1:12 - loss: 1.9361 - regression_loss: 1.6519 - classification_loss: 0.2841 290/500 [================>.............] - ETA: 1:12 - loss: 1.9361 - regression_loss: 1.6519 - classification_loss: 0.2842 291/500 [================>.............] - ETA: 1:12 - loss: 1.9371 - regression_loss: 1.6528 - classification_loss: 0.2844 292/500 [================>.............] - ETA: 1:11 - loss: 1.9348 - regression_loss: 1.6508 - classification_loss: 0.2840 293/500 [================>.............] 
[progress-bar updates for steps 294-499/500 of epoch 3 omitted; loss drifted from ~1.93 down to ~1.87 (regression_loss ~1.65 -> 1.60, classification_loss ~0.284 -> 0.273)]
500/500 [==============================] - 173s 346ms/step - loss: 1.8737 - regression_loss: 1.6006 - classification_loss: 0.2731
1172 instances of class plum with average precision: 0.5408
mAP: 0.5408
Epoch 00003: saving model to ./training/snapshots/resnet101_pascal_03.h5
Epoch 4/150
[progress-bar updates for steps 1-126/500 of epoch 4 omitted; loss settled from 2.2382 at step 1 to ~1.68 by step 126 (regression_loss ~1.44, classification_loss ~0.242)]
- ETA: 2:09 - loss: 1.6846 - regression_loss: 1.4421 - classification_loss: 0.2425 127/500 [======>.......................] - ETA: 2:08 - loss: 1.6803 - regression_loss: 1.4380 - classification_loss: 0.2423 128/500 [======>.......................] - ETA: 2:08 - loss: 1.6813 - regression_loss: 1.4389 - classification_loss: 0.2424 129/500 [======>.......................] - ETA: 2:08 - loss: 1.6862 - regression_loss: 1.4432 - classification_loss: 0.2430 130/500 [======>.......................] - ETA: 2:07 - loss: 1.6866 - regression_loss: 1.4437 - classification_loss: 0.2428 131/500 [======>.......................] - ETA: 2:07 - loss: 1.6823 - regression_loss: 1.4403 - classification_loss: 0.2420 132/500 [======>.......................] - ETA: 2:07 - loss: 1.6818 - regression_loss: 1.4399 - classification_loss: 0.2419 133/500 [======>.......................] - ETA: 2:06 - loss: 1.6800 - regression_loss: 1.4382 - classification_loss: 0.2418 134/500 [=======>......................] - ETA: 2:06 - loss: 1.6815 - regression_loss: 1.4395 - classification_loss: 0.2421 135/500 [=======>......................] - ETA: 2:06 - loss: 1.6766 - regression_loss: 1.4350 - classification_loss: 0.2416 136/500 [=======>......................] - ETA: 2:05 - loss: 1.6764 - regression_loss: 1.4349 - classification_loss: 0.2415 137/500 [=======>......................] - ETA: 2:05 - loss: 1.6766 - regression_loss: 1.4352 - classification_loss: 0.2414 138/500 [=======>......................] - ETA: 2:05 - loss: 1.6759 - regression_loss: 1.4349 - classification_loss: 0.2410 139/500 [=======>......................] - ETA: 2:05 - loss: 1.6745 - regression_loss: 1.4336 - classification_loss: 0.2408 140/500 [=======>......................] - ETA: 2:04 - loss: 1.6717 - regression_loss: 1.4315 - classification_loss: 0.2402 141/500 [=======>......................] - ETA: 2:04 - loss: 1.6705 - regression_loss: 1.4310 - classification_loss: 0.2395 142/500 [=======>......................] 
- ETA: 2:03 - loss: 1.6718 - regression_loss: 1.4316 - classification_loss: 0.2402 143/500 [=======>......................] - ETA: 2:03 - loss: 1.6719 - regression_loss: 1.4317 - classification_loss: 0.2402 144/500 [=======>......................] - ETA: 2:03 - loss: 1.6718 - regression_loss: 1.4317 - classification_loss: 0.2401 145/500 [=======>......................] - ETA: 2:03 - loss: 1.6716 - regression_loss: 1.4316 - classification_loss: 0.2400 146/500 [=======>......................] - ETA: 2:02 - loss: 1.6729 - regression_loss: 1.4328 - classification_loss: 0.2400 147/500 [=======>......................] - ETA: 2:02 - loss: 1.6740 - regression_loss: 1.4345 - classification_loss: 0.2396 148/500 [=======>......................] - ETA: 2:01 - loss: 1.6751 - regression_loss: 1.4357 - classification_loss: 0.2394 149/500 [=======>......................] - ETA: 2:01 - loss: 1.6764 - regression_loss: 1.4370 - classification_loss: 0.2394 150/500 [========>.....................] - ETA: 2:01 - loss: 1.6735 - regression_loss: 1.4345 - classification_loss: 0.2390 151/500 [========>.....................] - ETA: 2:00 - loss: 1.6722 - regression_loss: 1.4333 - classification_loss: 0.2388 152/500 [========>.....................] - ETA: 2:00 - loss: 1.6715 - regression_loss: 1.4329 - classification_loss: 0.2386 153/500 [========>.....................] - ETA: 2:00 - loss: 1.6749 - regression_loss: 1.4351 - classification_loss: 0.2398 154/500 [========>.....................] - ETA: 1:59 - loss: 1.6736 - regression_loss: 1.4340 - classification_loss: 0.2396 155/500 [========>.....................] - ETA: 1:59 - loss: 1.6743 - regression_loss: 1.4350 - classification_loss: 0.2393 156/500 [========>.....................] - ETA: 1:59 - loss: 1.6767 - regression_loss: 1.4371 - classification_loss: 0.2396 157/500 [========>.....................] - ETA: 1:58 - loss: 1.6767 - regression_loss: 1.4370 - classification_loss: 0.2396 158/500 [========>.....................] 
- ETA: 1:58 - loss: 1.6776 - regression_loss: 1.4377 - classification_loss: 0.2399 159/500 [========>.....................] - ETA: 1:58 - loss: 1.6769 - regression_loss: 1.4366 - classification_loss: 0.2403 160/500 [========>.....................] - ETA: 1:57 - loss: 1.6786 - regression_loss: 1.4382 - classification_loss: 0.2404 161/500 [========>.....................] - ETA: 1:57 - loss: 1.6803 - regression_loss: 1.4399 - classification_loss: 0.2404 162/500 [========>.....................] - ETA: 1:56 - loss: 1.6824 - regression_loss: 1.4416 - classification_loss: 0.2408 163/500 [========>.....................] - ETA: 1:56 - loss: 1.6835 - regression_loss: 1.4426 - classification_loss: 0.2409 164/500 [========>.....................] - ETA: 1:56 - loss: 1.6799 - regression_loss: 1.4394 - classification_loss: 0.2405 165/500 [========>.....................] - ETA: 1:55 - loss: 1.6836 - regression_loss: 1.4426 - classification_loss: 0.2410 166/500 [========>.....................] - ETA: 1:55 - loss: 1.6831 - regression_loss: 1.4421 - classification_loss: 0.2410 167/500 [=========>....................] - ETA: 1:55 - loss: 1.6808 - regression_loss: 1.4401 - classification_loss: 0.2407 168/500 [=========>....................] - ETA: 1:54 - loss: 1.6789 - regression_loss: 1.4384 - classification_loss: 0.2406 169/500 [=========>....................] - ETA: 1:54 - loss: 1.6780 - regression_loss: 1.4375 - classification_loss: 0.2405 170/500 [=========>....................] - ETA: 1:54 - loss: 1.6782 - regression_loss: 1.4378 - classification_loss: 0.2404 171/500 [=========>....................] - ETA: 1:53 - loss: 1.6795 - regression_loss: 1.4391 - classification_loss: 0.2404 172/500 [=========>....................] - ETA: 1:53 - loss: 1.6810 - regression_loss: 1.4401 - classification_loss: 0.2408 173/500 [=========>....................] - ETA: 1:53 - loss: 1.6825 - regression_loss: 1.4414 - classification_loss: 0.2411 174/500 [=========>....................] 
- ETA: 1:52 - loss: 1.6763 - regression_loss: 1.4359 - classification_loss: 0.2404 175/500 [=========>....................] - ETA: 1:52 - loss: 1.6793 - regression_loss: 1.4385 - classification_loss: 0.2408 176/500 [=========>....................] - ETA: 1:52 - loss: 1.6783 - regression_loss: 1.4375 - classification_loss: 0.2407 177/500 [=========>....................] - ETA: 1:51 - loss: 1.6775 - regression_loss: 1.4368 - classification_loss: 0.2406 178/500 [=========>....................] - ETA: 1:51 - loss: 1.6788 - regression_loss: 1.4379 - classification_loss: 0.2409 179/500 [=========>....................] - ETA: 1:51 - loss: 1.6743 - regression_loss: 1.4339 - classification_loss: 0.2403 180/500 [=========>....................] - ETA: 1:50 - loss: 1.6756 - regression_loss: 1.4353 - classification_loss: 0.2403 181/500 [=========>....................] - ETA: 1:50 - loss: 1.6777 - regression_loss: 1.4376 - classification_loss: 0.2400 182/500 [=========>....................] - ETA: 1:49 - loss: 1.6766 - regression_loss: 1.4365 - classification_loss: 0.2401 183/500 [=========>....................] - ETA: 1:49 - loss: 1.6765 - regression_loss: 1.4365 - classification_loss: 0.2401 184/500 [==========>...................] - ETA: 1:49 - loss: 1.6745 - regression_loss: 1.4347 - classification_loss: 0.2398 185/500 [==========>...................] - ETA: 1:49 - loss: 1.6740 - regression_loss: 1.4343 - classification_loss: 0.2397 186/500 [==========>...................] - ETA: 1:48 - loss: 1.6734 - regression_loss: 1.4341 - classification_loss: 0.2394 187/500 [==========>...................] - ETA: 1:48 - loss: 1.6750 - regression_loss: 1.4358 - classification_loss: 0.2392 188/500 [==========>...................] - ETA: 1:47 - loss: 1.6731 - regression_loss: 1.4341 - classification_loss: 0.2390 189/500 [==========>...................] - ETA: 1:47 - loss: 1.6753 - regression_loss: 1.4361 - classification_loss: 0.2392 190/500 [==========>...................] 
- ETA: 1:47 - loss: 1.6746 - regression_loss: 1.4357 - classification_loss: 0.2389 191/500 [==========>...................] - ETA: 1:46 - loss: 1.6767 - regression_loss: 1.4373 - classification_loss: 0.2394 192/500 [==========>...................] - ETA: 1:46 - loss: 1.6772 - regression_loss: 1.4379 - classification_loss: 0.2393 193/500 [==========>...................] - ETA: 1:46 - loss: 1.6770 - regression_loss: 1.4379 - classification_loss: 0.2391 194/500 [==========>...................] - ETA: 1:45 - loss: 1.6770 - regression_loss: 1.4379 - classification_loss: 0.2391 195/500 [==========>...................] - ETA: 1:45 - loss: 1.6768 - regression_loss: 1.4379 - classification_loss: 0.2389 196/500 [==========>...................] - ETA: 1:45 - loss: 1.6767 - regression_loss: 1.4378 - classification_loss: 0.2388 197/500 [==========>...................] - ETA: 1:44 - loss: 1.6757 - regression_loss: 1.4370 - classification_loss: 0.2387 198/500 [==========>...................] - ETA: 1:44 - loss: 1.6759 - regression_loss: 1.4373 - classification_loss: 0.2387 199/500 [==========>...................] - ETA: 1:44 - loss: 1.6774 - regression_loss: 1.4386 - classification_loss: 0.2389 200/500 [===========>..................] - ETA: 1:43 - loss: 1.6817 - regression_loss: 1.4425 - classification_loss: 0.2392 201/500 [===========>..................] - ETA: 1:43 - loss: 1.6839 - regression_loss: 1.4443 - classification_loss: 0.2396 202/500 [===========>..................] - ETA: 1:43 - loss: 1.6852 - regression_loss: 1.4456 - classification_loss: 0.2396 203/500 [===========>..................] - ETA: 1:42 - loss: 1.6875 - regression_loss: 1.4475 - classification_loss: 0.2400 204/500 [===========>..................] - ETA: 1:42 - loss: 1.6916 - regression_loss: 1.4512 - classification_loss: 0.2403 205/500 [===========>..................] - ETA: 1:42 - loss: 1.6930 - regression_loss: 1.4527 - classification_loss: 0.2404 206/500 [===========>..................] 
- ETA: 1:41 - loss: 1.6934 - regression_loss: 1.4531 - classification_loss: 0.2403 207/500 [===========>..................] - ETA: 1:41 - loss: 1.6938 - regression_loss: 1.4534 - classification_loss: 0.2404 208/500 [===========>..................] - ETA: 1:41 - loss: 1.6940 - regression_loss: 1.4534 - classification_loss: 0.2405 209/500 [===========>..................] - ETA: 1:40 - loss: 1.6945 - regression_loss: 1.4541 - classification_loss: 0.2404 210/500 [===========>..................] - ETA: 1:40 - loss: 1.6946 - regression_loss: 1.4544 - classification_loss: 0.2403 211/500 [===========>..................] - ETA: 1:40 - loss: 1.6960 - regression_loss: 1.4557 - classification_loss: 0.2403 212/500 [===========>..................] - ETA: 1:39 - loss: 1.6965 - regression_loss: 1.4564 - classification_loss: 0.2402 213/500 [===========>..................] - ETA: 1:39 - loss: 1.6976 - regression_loss: 1.4574 - classification_loss: 0.2402 214/500 [===========>..................] - ETA: 1:38 - loss: 1.6995 - regression_loss: 1.4592 - classification_loss: 0.2403 215/500 [===========>..................] - ETA: 1:38 - loss: 1.7007 - regression_loss: 1.4600 - classification_loss: 0.2407 216/500 [===========>..................] - ETA: 1:38 - loss: 1.7026 - regression_loss: 1.4614 - classification_loss: 0.2412 217/500 [============>.................] - ETA: 1:37 - loss: 1.7052 - regression_loss: 1.4635 - classification_loss: 0.2417 218/500 [============>.................] - ETA: 1:37 - loss: 1.7082 - regression_loss: 1.4661 - classification_loss: 0.2422 219/500 [============>.................] - ETA: 1:37 - loss: 1.7081 - regression_loss: 1.4660 - classification_loss: 0.2421 220/500 [============>.................] - ETA: 1:36 - loss: 1.7097 - regression_loss: 1.4675 - classification_loss: 0.2422 221/500 [============>.................] - ETA: 1:36 - loss: 1.7087 - regression_loss: 1.4666 - classification_loss: 0.2420 222/500 [============>.................] 
- ETA: 1:36 - loss: 1.7094 - regression_loss: 1.4673 - classification_loss: 0.2421 223/500 [============>.................] - ETA: 1:35 - loss: 1.7103 - regression_loss: 1.4681 - classification_loss: 0.2421 224/500 [============>.................] - ETA: 1:35 - loss: 1.7104 - regression_loss: 1.4682 - classification_loss: 0.2422 225/500 [============>.................] - ETA: 1:35 - loss: 1.7137 - regression_loss: 1.4712 - classification_loss: 0.2425 226/500 [============>.................] - ETA: 1:34 - loss: 1.7149 - regression_loss: 1.4721 - classification_loss: 0.2428 227/500 [============>.................] - ETA: 1:34 - loss: 1.7103 - regression_loss: 1.4680 - classification_loss: 0.2422 228/500 [============>.................] - ETA: 1:34 - loss: 1.7104 - regression_loss: 1.4683 - classification_loss: 0.2421 229/500 [============>.................] - ETA: 1:33 - loss: 1.7077 - regression_loss: 1.4661 - classification_loss: 0.2416 230/500 [============>.................] - ETA: 1:33 - loss: 1.7053 - regression_loss: 1.4641 - classification_loss: 0.2412 231/500 [============>.................] - ETA: 1:33 - loss: 1.7064 - regression_loss: 1.4650 - classification_loss: 0.2414 232/500 [============>.................] - ETA: 1:32 - loss: 1.7039 - regression_loss: 1.4628 - classification_loss: 0.2411 233/500 [============>.................] - ETA: 1:32 - loss: 1.7053 - regression_loss: 1.4640 - classification_loss: 0.2413 234/500 [=============>................] - ETA: 1:32 - loss: 1.7029 - regression_loss: 1.4619 - classification_loss: 0.2410 235/500 [=============>................] - ETA: 1:31 - loss: 1.7051 - regression_loss: 1.4636 - classification_loss: 0.2414 236/500 [=============>................] - ETA: 1:31 - loss: 1.7023 - regression_loss: 1.4611 - classification_loss: 0.2412 237/500 [=============>................] - ETA: 1:31 - loss: 1.7011 - regression_loss: 1.4603 - classification_loss: 0.2409 238/500 [=============>................] 
- ETA: 1:30 - loss: 1.6994 - regression_loss: 1.4589 - classification_loss: 0.2404 239/500 [=============>................] - ETA: 1:30 - loss: 1.7011 - regression_loss: 1.4604 - classification_loss: 0.2407 240/500 [=============>................] - ETA: 1:29 - loss: 1.7018 - regression_loss: 1.4612 - classification_loss: 0.2406 241/500 [=============>................] - ETA: 1:29 - loss: 1.7025 - regression_loss: 1.4617 - classification_loss: 0.2407 242/500 [=============>................] - ETA: 1:29 - loss: 1.7027 - regression_loss: 1.4618 - classification_loss: 0.2409 243/500 [=============>................] - ETA: 1:28 - loss: 1.7009 - regression_loss: 1.4599 - classification_loss: 0.2411 244/500 [=============>................] - ETA: 1:28 - loss: 1.6978 - regression_loss: 1.4573 - classification_loss: 0.2405 245/500 [=============>................] - ETA: 1:28 - loss: 1.6985 - regression_loss: 1.4578 - classification_loss: 0.2407 246/500 [=============>................] - ETA: 1:27 - loss: 1.6976 - regression_loss: 1.4568 - classification_loss: 0.2408 247/500 [=============>................] - ETA: 1:27 - loss: 1.6977 - regression_loss: 1.4570 - classification_loss: 0.2407 248/500 [=============>................] - ETA: 1:27 - loss: 1.6995 - regression_loss: 1.4584 - classification_loss: 0.2411 249/500 [=============>................] - ETA: 1:26 - loss: 1.6978 - regression_loss: 1.4568 - classification_loss: 0.2409 250/500 [==============>...............] - ETA: 1:26 - loss: 1.7005 - regression_loss: 1.4586 - classification_loss: 0.2419 251/500 [==============>...............] - ETA: 1:26 - loss: 1.6994 - regression_loss: 1.4577 - classification_loss: 0.2417 252/500 [==============>...............] - ETA: 1:25 - loss: 1.6989 - regression_loss: 1.4573 - classification_loss: 0.2416 253/500 [==============>...............] - ETA: 1:25 - loss: 1.7005 - regression_loss: 1.4587 - classification_loss: 0.2417 254/500 [==============>...............] 
- ETA: 1:25 - loss: 1.6987 - regression_loss: 1.4567 - classification_loss: 0.2420 255/500 [==============>...............] - ETA: 1:24 - loss: 1.6984 - regression_loss: 1.4564 - classification_loss: 0.2420 256/500 [==============>...............] - ETA: 1:24 - loss: 1.6971 - regression_loss: 1.4553 - classification_loss: 0.2418 257/500 [==============>...............] - ETA: 1:24 - loss: 1.6983 - regression_loss: 1.4562 - classification_loss: 0.2421 258/500 [==============>...............] - ETA: 1:23 - loss: 1.7001 - regression_loss: 1.4577 - classification_loss: 0.2424 259/500 [==============>...............] - ETA: 1:23 - loss: 1.7012 - regression_loss: 1.4586 - classification_loss: 0.2426 260/500 [==============>...............] - ETA: 1:23 - loss: 1.7021 - regression_loss: 1.4593 - classification_loss: 0.2428 261/500 [==============>...............] - ETA: 1:22 - loss: 1.7024 - regression_loss: 1.4594 - classification_loss: 0.2431 262/500 [==============>...............] - ETA: 1:22 - loss: 1.7006 - regression_loss: 1.4577 - classification_loss: 0.2429 263/500 [==============>...............] - ETA: 1:22 - loss: 1.6998 - regression_loss: 1.4569 - classification_loss: 0.2429 264/500 [==============>...............] - ETA: 1:21 - loss: 1.6982 - regression_loss: 1.4556 - classification_loss: 0.2426 265/500 [==============>...............] - ETA: 1:21 - loss: 1.6962 - regression_loss: 1.4539 - classification_loss: 0.2423 266/500 [==============>...............] - ETA: 1:21 - loss: 1.6962 - regression_loss: 1.4537 - classification_loss: 0.2425 267/500 [===============>..............] - ETA: 1:20 - loss: 1.6947 - regression_loss: 1.4523 - classification_loss: 0.2424 268/500 [===============>..............] - ETA: 1:20 - loss: 1.6968 - regression_loss: 1.4542 - classification_loss: 0.2427 269/500 [===============>..............] - ETA: 1:19 - loss: 1.6954 - regression_loss: 1.4528 - classification_loss: 0.2426 270/500 [===============>..............] 
- ETA: 1:19 - loss: 1.6959 - regression_loss: 1.4532 - classification_loss: 0.2427 271/500 [===============>..............] - ETA: 1:19 - loss: 1.6959 - regression_loss: 1.4533 - classification_loss: 0.2426 272/500 [===============>..............] - ETA: 1:18 - loss: 1.6940 - regression_loss: 1.4516 - classification_loss: 0.2424 273/500 [===============>..............] - ETA: 1:18 - loss: 1.6935 - regression_loss: 1.4513 - classification_loss: 0.2422 274/500 [===============>..............] - ETA: 1:18 - loss: 1.6941 - regression_loss: 1.4518 - classification_loss: 0.2423 275/500 [===============>..............] - ETA: 1:17 - loss: 1.6938 - regression_loss: 1.4515 - classification_loss: 0.2423 276/500 [===============>..............] - ETA: 1:17 - loss: 1.6943 - regression_loss: 1.4522 - classification_loss: 0.2421 277/500 [===============>..............] - ETA: 1:17 - loss: 1.6934 - regression_loss: 1.4514 - classification_loss: 0.2420 278/500 [===============>..............] - ETA: 1:16 - loss: 1.6931 - regression_loss: 1.4513 - classification_loss: 0.2418 279/500 [===============>..............] - ETA: 1:16 - loss: 1.6934 - regression_loss: 1.4517 - classification_loss: 0.2416 280/500 [===============>..............] - ETA: 1:16 - loss: 1.6952 - regression_loss: 1.4533 - classification_loss: 0.2420 281/500 [===============>..............] - ETA: 1:15 - loss: 1.6950 - regression_loss: 1.4531 - classification_loss: 0.2419 282/500 [===============>..............] - ETA: 1:15 - loss: 1.6945 - regression_loss: 1.4528 - classification_loss: 0.2417 283/500 [===============>..............] - ETA: 1:15 - loss: 1.6943 - regression_loss: 1.4528 - classification_loss: 0.2415 284/500 [================>.............] - ETA: 1:14 - loss: 1.6952 - regression_loss: 1.4537 - classification_loss: 0.2415 285/500 [================>.............] - ETA: 1:14 - loss: 1.6956 - regression_loss: 1.4542 - classification_loss: 0.2415 286/500 [================>.............] 
- ETA: 1:14 - loss: 1.6940 - regression_loss: 1.4528 - classification_loss: 0.2412 287/500 [================>.............] - ETA: 1:13 - loss: 1.6954 - regression_loss: 1.4542 - classification_loss: 0.2411 288/500 [================>.............] - ETA: 1:13 - loss: 1.6946 - regression_loss: 1.4536 - classification_loss: 0.2410 289/500 [================>.............] - ETA: 1:12 - loss: 1.6930 - regression_loss: 1.4522 - classification_loss: 0.2408 290/500 [================>.............] - ETA: 1:12 - loss: 1.6920 - regression_loss: 1.4515 - classification_loss: 0.2405 291/500 [================>.............] - ETA: 1:12 - loss: 1.6901 - regression_loss: 1.4499 - classification_loss: 0.2401 292/500 [================>.............] - ETA: 1:11 - loss: 1.6899 - regression_loss: 1.4498 - classification_loss: 0.2401 293/500 [================>.............] - ETA: 1:11 - loss: 1.6916 - regression_loss: 1.4510 - classification_loss: 0.2406 294/500 [================>.............] - ETA: 1:11 - loss: 1.6910 - regression_loss: 1.4506 - classification_loss: 0.2405 295/500 [================>.............] - ETA: 1:10 - loss: 1.6904 - regression_loss: 1.4501 - classification_loss: 0.2403 296/500 [================>.............] - ETA: 1:10 - loss: 1.6899 - regression_loss: 1.4498 - classification_loss: 0.2402 297/500 [================>.............] - ETA: 1:10 - loss: 1.6910 - regression_loss: 1.4508 - classification_loss: 0.2402 298/500 [================>.............] - ETA: 1:09 - loss: 1.6921 - regression_loss: 1.4517 - classification_loss: 0.2404 299/500 [================>.............] - ETA: 1:09 - loss: 1.6921 - regression_loss: 1.4518 - classification_loss: 0.2404 300/500 [=================>............] - ETA: 1:09 - loss: 1.6919 - regression_loss: 1.4517 - classification_loss: 0.2402 301/500 [=================>............] - ETA: 1:08 - loss: 1.6926 - regression_loss: 1.4523 - classification_loss: 0.2403 302/500 [=================>............] 
- ETA: 1:08 - loss: 1.6955 - regression_loss: 1.4542 - classification_loss: 0.2413 303/500 [=================>............] - ETA: 1:08 - loss: 1.6957 - regression_loss: 1.4543 - classification_loss: 0.2414 304/500 [=================>............] - ETA: 1:07 - loss: 1.6975 - regression_loss: 1.4557 - classification_loss: 0.2418 305/500 [=================>............] - ETA: 1:07 - loss: 1.6956 - regression_loss: 1.4541 - classification_loss: 0.2414 306/500 [=================>............] - ETA: 1:07 - loss: 1.6980 - regression_loss: 1.4562 - classification_loss: 0.2418 307/500 [=================>............] - ETA: 1:06 - loss: 1.6984 - regression_loss: 1.4566 - classification_loss: 0.2419 308/500 [=================>............] - ETA: 1:06 - loss: 1.6990 - regression_loss: 1.4571 - classification_loss: 0.2419 309/500 [=================>............] - ETA: 1:05 - loss: 1.6995 - regression_loss: 1.4576 - classification_loss: 0.2419 310/500 [=================>............] - ETA: 1:05 - loss: 1.6967 - regression_loss: 1.4551 - classification_loss: 0.2416 311/500 [=================>............] - ETA: 1:05 - loss: 1.6960 - regression_loss: 1.4545 - classification_loss: 0.2415 312/500 [=================>............] - ETA: 1:04 - loss: 1.6931 - regression_loss: 1.4520 - classification_loss: 0.2411 313/500 [=================>............] - ETA: 1:04 - loss: 1.6936 - regression_loss: 1.4525 - classification_loss: 0.2412 314/500 [=================>............] - ETA: 1:04 - loss: 1.6941 - regression_loss: 1.4530 - classification_loss: 0.2411 315/500 [=================>............] - ETA: 1:03 - loss: 1.6952 - regression_loss: 1.4541 - classification_loss: 0.2412 316/500 [=================>............] - ETA: 1:03 - loss: 1.6957 - regression_loss: 1.4544 - classification_loss: 0.2413 317/500 [==================>...........] - ETA: 1:03 - loss: 1.6953 - regression_loss: 1.4540 - classification_loss: 0.2412 318/500 [==================>...........] 
- ETA: 1:02 - loss: 1.6958 - regression_loss: 1.4546 - classification_loss: 0.2413 319/500 [==================>...........] - ETA: 1:02 - loss: 1.6932 - regression_loss: 1.4523 - classification_loss: 0.2408 320/500 [==================>...........] - ETA: 1:02 - loss: 1.6934 - regression_loss: 1.4525 - classification_loss: 0.2409 321/500 [==================>...........] - ETA: 1:01 - loss: 1.6920 - regression_loss: 1.4513 - classification_loss: 0.2407 322/500 [==================>...........] - ETA: 1:01 - loss: 1.6913 - regression_loss: 1.4508 - classification_loss: 0.2405 323/500 [==================>...........] - ETA: 1:01 - loss: 1.6903 - regression_loss: 1.4498 - classification_loss: 0.2405 324/500 [==================>...........] - ETA: 1:00 - loss: 1.6904 - regression_loss: 1.4500 - classification_loss: 0.2404 325/500 [==================>...........] - ETA: 1:00 - loss: 1.6906 - regression_loss: 1.4502 - classification_loss: 0.2404 326/500 [==================>...........] - ETA: 59s - loss: 1.6918 - regression_loss: 1.4513 - classification_loss: 0.2405  327/500 [==================>...........] - ETA: 59s - loss: 1.6918 - regression_loss: 1.4513 - classification_loss: 0.2405 328/500 [==================>...........] - ETA: 59s - loss: 1.6934 - regression_loss: 1.4527 - classification_loss: 0.2407 329/500 [==================>...........] - ETA: 58s - loss: 1.6936 - regression_loss: 1.4530 - classification_loss: 0.2406 330/500 [==================>...........] - ETA: 58s - loss: 1.6947 - regression_loss: 1.4541 - classification_loss: 0.2406 331/500 [==================>...........] - ETA: 58s - loss: 1.6957 - regression_loss: 1.4551 - classification_loss: 0.2405 332/500 [==================>...........] - ETA: 57s - loss: 1.6950 - regression_loss: 1.4546 - classification_loss: 0.2404 333/500 [==================>...........] - ETA: 57s - loss: 1.6934 - regression_loss: 1.4529 - classification_loss: 0.2405 334/500 [===================>..........] 
[... per-step progress output for steps 335-499 of epoch 4 truncated; loss decreased steadily from ~1.695 to ~1.650 ...]
500/500 [==============================] - 172s 344ms/step - loss: 1.6499 - regression_loss: 1.4173 - classification_loss: 0.2326
1172 instances of class plum with average precision: 0.6022
mAP: 0.6022
Epoch 00004: saving model to ./training/snapshots/resnet101_pascal_04.h5
Epoch 5/150
[... per-step progress output for steps 1-169 of epoch 5 truncated; running loss fluctuated between ~1.27 and ~1.62 before settling near ~1.53 ...]
- ETA: 1:54 - loss: 1.5333 - regression_loss: 1.3205 - classification_loss: 0.2129 170/500 [=========>....................] - ETA: 1:54 - loss: 1.5284 - regression_loss: 1.3161 - classification_loss: 0.2123 171/500 [=========>....................] - ETA: 1:53 - loss: 1.5259 - regression_loss: 1.3141 - classification_loss: 0.2118 172/500 [=========>....................] - ETA: 1:53 - loss: 1.5232 - regression_loss: 1.3117 - classification_loss: 0.2116 173/500 [=========>....................] - ETA: 1:53 - loss: 1.5249 - regression_loss: 1.3130 - classification_loss: 0.2118 174/500 [=========>....................] - ETA: 1:52 - loss: 1.5250 - regression_loss: 1.3132 - classification_loss: 0.2118 175/500 [=========>....................] - ETA: 1:52 - loss: 1.5233 - regression_loss: 1.3117 - classification_loss: 0.2117 176/500 [=========>....................] - ETA: 1:52 - loss: 1.5242 - regression_loss: 1.3123 - classification_loss: 0.2119 177/500 [=========>....................] - ETA: 1:51 - loss: 1.5280 - regression_loss: 1.3159 - classification_loss: 0.2122 178/500 [=========>....................] - ETA: 1:51 - loss: 1.5288 - regression_loss: 1.3166 - classification_loss: 0.2122 179/500 [=========>....................] - ETA: 1:51 - loss: 1.5285 - regression_loss: 1.3163 - classification_loss: 0.2122 180/500 [=========>....................] - ETA: 1:50 - loss: 1.5323 - regression_loss: 1.3190 - classification_loss: 0.2133 181/500 [=========>....................] - ETA: 1:50 - loss: 1.5316 - regression_loss: 1.3183 - classification_loss: 0.2133 182/500 [=========>....................] - ETA: 1:50 - loss: 1.5328 - regression_loss: 1.3194 - classification_loss: 0.2134 183/500 [=========>....................] - ETA: 1:49 - loss: 1.5345 - regression_loss: 1.3209 - classification_loss: 0.2135 184/500 [==========>...................] - ETA: 1:49 - loss: 1.5371 - regression_loss: 1.3233 - classification_loss: 0.2137 185/500 [==========>...................] 
- ETA: 1:49 - loss: 1.5372 - regression_loss: 1.3234 - classification_loss: 0.2139 186/500 [==========>...................] - ETA: 1:48 - loss: 1.5378 - regression_loss: 1.3239 - classification_loss: 0.2139 187/500 [==========>...................] - ETA: 1:48 - loss: 1.5391 - regression_loss: 1.3249 - classification_loss: 0.2142 188/500 [==========>...................] - ETA: 1:48 - loss: 1.5382 - regression_loss: 1.3240 - classification_loss: 0.2142 189/500 [==========>...................] - ETA: 1:47 - loss: 1.5385 - regression_loss: 1.3246 - classification_loss: 0.2139 190/500 [==========>...................] - ETA: 1:47 - loss: 1.5396 - regression_loss: 1.3254 - classification_loss: 0.2141 191/500 [==========>...................] - ETA: 1:47 - loss: 1.5366 - regression_loss: 1.3230 - classification_loss: 0.2136 192/500 [==========>...................] - ETA: 1:46 - loss: 1.5350 - regression_loss: 1.3218 - classification_loss: 0.2132 193/500 [==========>...................] - ETA: 1:46 - loss: 1.5350 - regression_loss: 1.3220 - classification_loss: 0.2130 194/500 [==========>...................] - ETA: 1:46 - loss: 1.5344 - regression_loss: 1.3217 - classification_loss: 0.2127 195/500 [==========>...................] - ETA: 1:45 - loss: 1.5389 - regression_loss: 1.3258 - classification_loss: 0.2132 196/500 [==========>...................] - ETA: 1:45 - loss: 1.5398 - regression_loss: 1.3262 - classification_loss: 0.2136 197/500 [==========>...................] - ETA: 1:45 - loss: 1.5390 - regression_loss: 1.3257 - classification_loss: 0.2133 198/500 [==========>...................] - ETA: 1:44 - loss: 1.5378 - regression_loss: 1.3248 - classification_loss: 0.2130 199/500 [==========>...................] - ETA: 1:44 - loss: 1.5422 - regression_loss: 1.3287 - classification_loss: 0.2135 200/500 [===========>..................] - ETA: 1:44 - loss: 1.5459 - regression_loss: 1.3319 - classification_loss: 0.2140 201/500 [===========>..................] 
- ETA: 1:43 - loss: 1.5436 - regression_loss: 1.3298 - classification_loss: 0.2138 202/500 [===========>..................] - ETA: 1:43 - loss: 1.5426 - regression_loss: 1.3289 - classification_loss: 0.2137 203/500 [===========>..................] - ETA: 1:43 - loss: 1.5426 - regression_loss: 1.3289 - classification_loss: 0.2137 204/500 [===========>..................] - ETA: 1:42 - loss: 1.5397 - regression_loss: 1.3260 - classification_loss: 0.2137 205/500 [===========>..................] - ETA: 1:42 - loss: 1.5401 - regression_loss: 1.3263 - classification_loss: 0.2138 206/500 [===========>..................] - ETA: 1:42 - loss: 1.5385 - regression_loss: 1.3249 - classification_loss: 0.2135 207/500 [===========>..................] - ETA: 1:41 - loss: 1.5367 - regression_loss: 1.3233 - classification_loss: 0.2133 208/500 [===========>..................] - ETA: 1:41 - loss: 1.5337 - regression_loss: 1.3205 - classification_loss: 0.2132 209/500 [===========>..................] - ETA: 1:40 - loss: 1.5314 - regression_loss: 1.3184 - classification_loss: 0.2129 210/500 [===========>..................] - ETA: 1:40 - loss: 1.5311 - regression_loss: 1.3183 - classification_loss: 0.2128 211/500 [===========>..................] - ETA: 1:40 - loss: 1.5300 - regression_loss: 1.3174 - classification_loss: 0.2126 212/500 [===========>..................] - ETA: 1:40 - loss: 1.5360 - regression_loss: 1.3226 - classification_loss: 0.2134 213/500 [===========>..................] - ETA: 1:39 - loss: 1.5310 - regression_loss: 1.3181 - classification_loss: 0.2128 214/500 [===========>..................] - ETA: 1:39 - loss: 1.5307 - regression_loss: 1.3180 - classification_loss: 0.2127 215/500 [===========>..................] - ETA: 1:38 - loss: 1.5287 - regression_loss: 1.3162 - classification_loss: 0.2125 216/500 [===========>..................] - ETA: 1:38 - loss: 1.5267 - regression_loss: 1.3144 - classification_loss: 0.2124 217/500 [============>.................] 
- ETA: 1:38 - loss: 1.5282 - regression_loss: 1.3154 - classification_loss: 0.2128 218/500 [============>.................] - ETA: 1:37 - loss: 1.5298 - regression_loss: 1.3166 - classification_loss: 0.2132 219/500 [============>.................] - ETA: 1:37 - loss: 1.5294 - regression_loss: 1.3163 - classification_loss: 0.2131 220/500 [============>.................] - ETA: 1:37 - loss: 1.5304 - regression_loss: 1.3173 - classification_loss: 0.2131 221/500 [============>.................] - ETA: 1:36 - loss: 1.5300 - regression_loss: 1.3169 - classification_loss: 0.2131 222/500 [============>.................] - ETA: 1:36 - loss: 1.5307 - regression_loss: 1.3174 - classification_loss: 0.2133 223/500 [============>.................] - ETA: 1:36 - loss: 1.5282 - regression_loss: 1.3155 - classification_loss: 0.2128 224/500 [============>.................] - ETA: 1:35 - loss: 1.5287 - regression_loss: 1.3160 - classification_loss: 0.2127 225/500 [============>.................] - ETA: 1:35 - loss: 1.5301 - regression_loss: 1.3172 - classification_loss: 0.2129 226/500 [============>.................] - ETA: 1:35 - loss: 1.5301 - regression_loss: 1.3173 - classification_loss: 0.2128 227/500 [============>.................] - ETA: 1:34 - loss: 1.5288 - regression_loss: 1.3161 - classification_loss: 0.2127 228/500 [============>.................] - ETA: 1:34 - loss: 1.5290 - regression_loss: 1.3164 - classification_loss: 0.2126 229/500 [============>.................] - ETA: 1:34 - loss: 1.5313 - regression_loss: 1.3184 - classification_loss: 0.2128 230/500 [============>.................] - ETA: 1:33 - loss: 1.5317 - regression_loss: 1.3188 - classification_loss: 0.2128 231/500 [============>.................] - ETA: 1:33 - loss: 1.5306 - regression_loss: 1.3178 - classification_loss: 0.2128 232/500 [============>.................] - ETA: 1:33 - loss: 1.5289 - regression_loss: 1.3162 - classification_loss: 0.2127 233/500 [============>.................] 
- ETA: 1:32 - loss: 1.5267 - regression_loss: 1.3142 - classification_loss: 0.2125 234/500 [=============>................] - ETA: 1:32 - loss: 1.5248 - regression_loss: 1.3117 - classification_loss: 0.2130 235/500 [=============>................] - ETA: 1:32 - loss: 1.5257 - regression_loss: 1.3126 - classification_loss: 0.2131 236/500 [=============>................] - ETA: 1:31 - loss: 1.5266 - regression_loss: 1.3137 - classification_loss: 0.2129 237/500 [=============>................] - ETA: 1:31 - loss: 1.5277 - regression_loss: 1.3147 - classification_loss: 0.2130 238/500 [=============>................] - ETA: 1:31 - loss: 1.5249 - regression_loss: 1.3122 - classification_loss: 0.2127 239/500 [=============>................] - ETA: 1:30 - loss: 1.5225 - regression_loss: 1.3101 - classification_loss: 0.2124 240/500 [=============>................] - ETA: 1:30 - loss: 1.5193 - regression_loss: 1.3074 - classification_loss: 0.2118 241/500 [=============>................] - ETA: 1:30 - loss: 1.5217 - regression_loss: 1.3095 - classification_loss: 0.2122 242/500 [=============>................] - ETA: 1:29 - loss: 1.5196 - regression_loss: 1.3076 - classification_loss: 0.2120 243/500 [=============>................] - ETA: 1:29 - loss: 1.5166 - regression_loss: 1.3051 - classification_loss: 0.2115 244/500 [=============>................] - ETA: 1:29 - loss: 1.5166 - regression_loss: 1.3050 - classification_loss: 0.2116 245/500 [=============>................] - ETA: 1:28 - loss: 1.5172 - regression_loss: 1.3051 - classification_loss: 0.2121 246/500 [=============>................] - ETA: 1:28 - loss: 1.5167 - regression_loss: 1.3049 - classification_loss: 0.2119 247/500 [=============>................] - ETA: 1:27 - loss: 1.5183 - regression_loss: 1.3063 - classification_loss: 0.2120 248/500 [=============>................] - ETA: 1:27 - loss: 1.5181 - regression_loss: 1.3058 - classification_loss: 0.2123 249/500 [=============>................] 
- ETA: 1:27 - loss: 1.5187 - regression_loss: 1.3064 - classification_loss: 0.2123 250/500 [==============>...............] - ETA: 1:26 - loss: 1.5198 - regression_loss: 1.3074 - classification_loss: 0.2124 251/500 [==============>...............] - ETA: 1:26 - loss: 1.5210 - regression_loss: 1.3086 - classification_loss: 0.2124 252/500 [==============>...............] - ETA: 1:26 - loss: 1.5221 - regression_loss: 1.3095 - classification_loss: 0.2125 253/500 [==============>...............] - ETA: 1:25 - loss: 1.5231 - regression_loss: 1.3104 - classification_loss: 0.2127 254/500 [==============>...............] - ETA: 1:25 - loss: 1.5254 - regression_loss: 1.3122 - classification_loss: 0.2132 255/500 [==============>...............] - ETA: 1:25 - loss: 1.5259 - regression_loss: 1.3126 - classification_loss: 0.2133 256/500 [==============>...............] - ETA: 1:24 - loss: 1.5260 - regression_loss: 1.3127 - classification_loss: 0.2132 257/500 [==============>...............] - ETA: 1:24 - loss: 1.5260 - regression_loss: 1.3128 - classification_loss: 0.2132 258/500 [==============>...............] - ETA: 1:24 - loss: 1.5245 - regression_loss: 1.3110 - classification_loss: 0.2134 259/500 [==============>...............] - ETA: 1:23 - loss: 1.5243 - regression_loss: 1.3108 - classification_loss: 0.2135 260/500 [==============>...............] - ETA: 1:23 - loss: 1.5247 - regression_loss: 1.3112 - classification_loss: 0.2135 261/500 [==============>...............] - ETA: 1:23 - loss: 1.5263 - regression_loss: 1.3124 - classification_loss: 0.2139 262/500 [==============>...............] - ETA: 1:22 - loss: 1.5240 - regression_loss: 1.3105 - classification_loss: 0.2135 263/500 [==============>...............] - ETA: 1:22 - loss: 1.5238 - regression_loss: 1.3101 - classification_loss: 0.2137 264/500 [==============>...............] - ETA: 1:22 - loss: 1.5231 - regression_loss: 1.3095 - classification_loss: 0.2135 265/500 [==============>...............] 
- ETA: 1:21 - loss: 1.5211 - regression_loss: 1.3077 - classification_loss: 0.2134 266/500 [==============>...............] - ETA: 1:21 - loss: 1.5218 - regression_loss: 1.3080 - classification_loss: 0.2138 267/500 [===============>..............] - ETA: 1:21 - loss: 1.5199 - regression_loss: 1.3065 - classification_loss: 0.2134 268/500 [===============>..............] - ETA: 1:20 - loss: 1.5193 - regression_loss: 1.3060 - classification_loss: 0.2133 269/500 [===============>..............] - ETA: 1:20 - loss: 1.5209 - regression_loss: 1.3074 - classification_loss: 0.2136 270/500 [===============>..............] - ETA: 1:20 - loss: 1.5194 - regression_loss: 1.3061 - classification_loss: 0.2133 271/500 [===============>..............] - ETA: 1:19 - loss: 1.5189 - regression_loss: 1.3056 - classification_loss: 0.2132 272/500 [===============>..............] - ETA: 1:19 - loss: 1.5175 - regression_loss: 1.3046 - classification_loss: 0.2129 273/500 [===============>..............] - ETA: 1:18 - loss: 1.5181 - regression_loss: 1.3052 - classification_loss: 0.2129 274/500 [===============>..............] - ETA: 1:18 - loss: 1.5167 - regression_loss: 1.3039 - classification_loss: 0.2128 275/500 [===============>..............] - ETA: 1:18 - loss: 1.5151 - regression_loss: 1.3025 - classification_loss: 0.2126 276/500 [===============>..............] - ETA: 1:17 - loss: 1.5148 - regression_loss: 1.3023 - classification_loss: 0.2125 277/500 [===============>..............] - ETA: 1:17 - loss: 1.5131 - regression_loss: 1.3009 - classification_loss: 0.2122 278/500 [===============>..............] - ETA: 1:17 - loss: 1.5104 - regression_loss: 1.2987 - classification_loss: 0.2117 279/500 [===============>..............] - ETA: 1:16 - loss: 1.5114 - regression_loss: 1.2995 - classification_loss: 0.2119 280/500 [===============>..............] - ETA: 1:16 - loss: 1.5116 - regression_loss: 1.2997 - classification_loss: 0.2119 281/500 [===============>..............] 
- ETA: 1:16 - loss: 1.5109 - regression_loss: 1.2992 - classification_loss: 0.2117 282/500 [===============>..............] - ETA: 1:15 - loss: 1.5114 - regression_loss: 1.2996 - classification_loss: 0.2118 283/500 [===============>..............] - ETA: 1:15 - loss: 1.5114 - regression_loss: 1.2997 - classification_loss: 0.2117 284/500 [================>.............] - ETA: 1:15 - loss: 1.5101 - regression_loss: 1.2987 - classification_loss: 0.2114 285/500 [================>.............] - ETA: 1:14 - loss: 1.5112 - regression_loss: 1.2997 - classification_loss: 0.2115 286/500 [================>.............] - ETA: 1:14 - loss: 1.5107 - regression_loss: 1.2993 - classification_loss: 0.2113 287/500 [================>.............] - ETA: 1:14 - loss: 1.5108 - regression_loss: 1.2995 - classification_loss: 0.2113 288/500 [================>.............] - ETA: 1:13 - loss: 1.5121 - regression_loss: 1.3007 - classification_loss: 0.2115 289/500 [================>.............] - ETA: 1:13 - loss: 1.5136 - regression_loss: 1.3021 - classification_loss: 0.2115 290/500 [================>.............] - ETA: 1:13 - loss: 1.5123 - regression_loss: 1.3011 - classification_loss: 0.2112 291/500 [================>.............] - ETA: 1:12 - loss: 1.5110 - regression_loss: 1.3000 - classification_loss: 0.2111 292/500 [================>.............] - ETA: 1:12 - loss: 1.5105 - regression_loss: 1.2994 - classification_loss: 0.2111 293/500 [================>.............] - ETA: 1:11 - loss: 1.5106 - regression_loss: 1.2996 - classification_loss: 0.2110 294/500 [================>.............] - ETA: 1:11 - loss: 1.5111 - regression_loss: 1.3002 - classification_loss: 0.2109 295/500 [================>.............] - ETA: 1:11 - loss: 1.5110 - regression_loss: 1.3001 - classification_loss: 0.2109 296/500 [================>.............] - ETA: 1:10 - loss: 1.5113 - regression_loss: 1.3006 - classification_loss: 0.2107 297/500 [================>.............] 
- ETA: 1:10 - loss: 1.5121 - regression_loss: 1.3015 - classification_loss: 0.2107 298/500 [================>.............] - ETA: 1:10 - loss: 1.5131 - regression_loss: 1.3023 - classification_loss: 0.2108 299/500 [================>.............] - ETA: 1:09 - loss: 1.5128 - regression_loss: 1.3020 - classification_loss: 0.2107 300/500 [=================>............] - ETA: 1:09 - loss: 1.5125 - regression_loss: 1.3018 - classification_loss: 0.2107 301/500 [=================>............] - ETA: 1:09 - loss: 1.5103 - regression_loss: 1.3000 - classification_loss: 0.2103 302/500 [=================>............] - ETA: 1:08 - loss: 1.5106 - regression_loss: 1.3004 - classification_loss: 0.2102 303/500 [=================>............] - ETA: 1:08 - loss: 1.5113 - regression_loss: 1.3011 - classification_loss: 0.2102 304/500 [=================>............] - ETA: 1:08 - loss: 1.5106 - regression_loss: 1.3005 - classification_loss: 0.2101 305/500 [=================>............] - ETA: 1:07 - loss: 1.5102 - regression_loss: 1.3002 - classification_loss: 0.2100 306/500 [=================>............] - ETA: 1:07 - loss: 1.5102 - regression_loss: 1.3003 - classification_loss: 0.2100 307/500 [=================>............] - ETA: 1:07 - loss: 1.5080 - regression_loss: 1.2983 - classification_loss: 0.2097 308/500 [=================>............] - ETA: 1:06 - loss: 1.5091 - regression_loss: 1.2992 - classification_loss: 0.2098 309/500 [=================>............] - ETA: 1:06 - loss: 1.5086 - regression_loss: 1.2990 - classification_loss: 0.2096 310/500 [=================>............] - ETA: 1:06 - loss: 1.5095 - regression_loss: 1.2999 - classification_loss: 0.2095 311/500 [=================>............] - ETA: 1:05 - loss: 1.5087 - regression_loss: 1.2993 - classification_loss: 0.2094 312/500 [=================>............] - ETA: 1:05 - loss: 1.5093 - regression_loss: 1.3001 - classification_loss: 0.2092 313/500 [=================>............] 
- ETA: 1:05 - loss: 1.5093 - regression_loss: 1.3000 - classification_loss: 0.2093 314/500 [=================>............] - ETA: 1:04 - loss: 1.5090 - regression_loss: 1.2999 - classification_loss: 0.2091 315/500 [=================>............] - ETA: 1:04 - loss: 1.5080 - regression_loss: 1.2992 - classification_loss: 0.2089 316/500 [=================>............] - ETA: 1:04 - loss: 1.5071 - regression_loss: 1.2983 - classification_loss: 0.2088 317/500 [==================>...........] - ETA: 1:03 - loss: 1.5079 - regression_loss: 1.2988 - classification_loss: 0.2091 318/500 [==================>...........] - ETA: 1:03 - loss: 1.5066 - regression_loss: 1.2975 - classification_loss: 0.2091 319/500 [==================>...........] - ETA: 1:03 - loss: 1.5044 - regression_loss: 1.2956 - classification_loss: 0.2088 320/500 [==================>...........] - ETA: 1:02 - loss: 1.5053 - regression_loss: 1.2964 - classification_loss: 0.2090 321/500 [==================>...........] - ETA: 1:02 - loss: 1.5048 - regression_loss: 1.2961 - classification_loss: 0.2087 322/500 [==================>...........] - ETA: 1:01 - loss: 1.5045 - regression_loss: 1.2959 - classification_loss: 0.2086 323/500 [==================>...........] - ETA: 1:01 - loss: 1.5024 - regression_loss: 1.2941 - classification_loss: 0.2083 324/500 [==================>...........] - ETA: 1:01 - loss: 1.5007 - regression_loss: 1.2927 - classification_loss: 0.2080 325/500 [==================>...........] - ETA: 1:00 - loss: 1.5017 - regression_loss: 1.2935 - classification_loss: 0.2081 326/500 [==================>...........] - ETA: 1:00 - loss: 1.5007 - regression_loss: 1.2928 - classification_loss: 0.2079 327/500 [==================>...........] - ETA: 1:00 - loss: 1.5042 - regression_loss: 1.2957 - classification_loss: 0.2085 328/500 [==================>...........] - ETA: 59s - loss: 1.5046 - regression_loss: 1.2962 - classification_loss: 0.2085  329/500 [==================>...........] 
- ETA: 59s - loss: 1.5038 - regression_loss: 1.2955 - classification_loss: 0.2083 330/500 [==================>...........] - ETA: 59s - loss: 1.5047 - regression_loss: 1.2964 - classification_loss: 0.2083 331/500 [==================>...........] - ETA: 58s - loss: 1.5048 - regression_loss: 1.2964 - classification_loss: 0.2083 332/500 [==================>...........] - ETA: 58s - loss: 1.5030 - regression_loss: 1.2949 - classification_loss: 0.2081 333/500 [==================>...........] - ETA: 58s - loss: 1.5036 - regression_loss: 1.2955 - classification_loss: 0.2081 334/500 [===================>..........] - ETA: 57s - loss: 1.5040 - regression_loss: 1.2959 - classification_loss: 0.2082 335/500 [===================>..........] - ETA: 57s - loss: 1.5045 - regression_loss: 1.2962 - classification_loss: 0.2084 336/500 [===================>..........] - ETA: 57s - loss: 1.5035 - regression_loss: 1.2953 - classification_loss: 0.2081 337/500 [===================>..........] - ETA: 56s - loss: 1.5038 - regression_loss: 1.2956 - classification_loss: 0.2082 338/500 [===================>..........] - ETA: 56s - loss: 1.5034 - regression_loss: 1.2953 - classification_loss: 0.2081 339/500 [===================>..........] - ETA: 56s - loss: 1.5006 - regression_loss: 1.2926 - classification_loss: 0.2080 340/500 [===================>..........] - ETA: 55s - loss: 1.5006 - regression_loss: 1.2925 - classification_loss: 0.2081 341/500 [===================>..........] - ETA: 55s - loss: 1.5009 - regression_loss: 1.2928 - classification_loss: 0.2081 342/500 [===================>..........] - ETA: 55s - loss: 1.5003 - regression_loss: 1.2924 - classification_loss: 0.2079 343/500 [===================>..........] - ETA: 54s - loss: 1.4999 - regression_loss: 1.2922 - classification_loss: 0.2077 344/500 [===================>..........] - ETA: 54s - loss: 1.4995 - regression_loss: 1.2919 - classification_loss: 0.2076 345/500 [===================>..........] 
- ETA: 53s - loss: 1.4976 - regression_loss: 1.2902 - classification_loss: 0.2074 346/500 [===================>..........] - ETA: 53s - loss: 1.4976 - regression_loss: 1.2901 - classification_loss: 0.2074 347/500 [===================>..........] - ETA: 53s - loss: 1.4987 - regression_loss: 1.2911 - classification_loss: 0.2076 348/500 [===================>..........] - ETA: 52s - loss: 1.4989 - regression_loss: 1.2911 - classification_loss: 0.2078 349/500 [===================>..........] - ETA: 52s - loss: 1.4974 - regression_loss: 1.2900 - classification_loss: 0.2075 350/500 [====================>.........] - ETA: 52s - loss: 1.4993 - regression_loss: 1.2916 - classification_loss: 0.2078 351/500 [====================>.........] - ETA: 51s - loss: 1.4997 - regression_loss: 1.2920 - classification_loss: 0.2078 352/500 [====================>.........] - ETA: 51s - loss: 1.5004 - regression_loss: 1.2926 - classification_loss: 0.2078 353/500 [====================>.........] - ETA: 51s - loss: 1.4976 - regression_loss: 1.2902 - classification_loss: 0.2074 354/500 [====================>.........] - ETA: 50s - loss: 1.4967 - regression_loss: 1.2895 - classification_loss: 0.2072 355/500 [====================>.........] - ETA: 50s - loss: 1.4962 - regression_loss: 1.2891 - classification_loss: 0.2071 356/500 [====================>.........] - ETA: 50s - loss: 1.4961 - regression_loss: 1.2889 - classification_loss: 0.2072 357/500 [====================>.........] - ETA: 49s - loss: 1.4966 - regression_loss: 1.2893 - classification_loss: 0.2073 358/500 [====================>.........] - ETA: 49s - loss: 1.4970 - regression_loss: 1.2895 - classification_loss: 0.2075 359/500 [====================>.........] - ETA: 49s - loss: 1.4986 - regression_loss: 1.2912 - classification_loss: 0.2075 360/500 [====================>.........] - ETA: 48s - loss: 1.4995 - regression_loss: 1.2920 - classification_loss: 0.2075 361/500 [====================>.........] 
- ETA: 48s - loss: 1.5013 - regression_loss: 1.2936 - classification_loss: 0.2078 362/500 [====================>.........] - ETA: 48s - loss: 1.5029 - regression_loss: 1.2950 - classification_loss: 0.2079 363/500 [====================>.........] - ETA: 47s - loss: 1.5020 - regression_loss: 1.2944 - classification_loss: 0.2077 364/500 [====================>.........] - ETA: 47s - loss: 1.5020 - regression_loss: 1.2943 - classification_loss: 0.2077 365/500 [====================>.........] - ETA: 47s - loss: 1.5018 - regression_loss: 1.2940 - classification_loss: 0.2077 366/500 [====================>.........] - ETA: 46s - loss: 1.5023 - regression_loss: 1.2945 - classification_loss: 0.2078 367/500 [=====================>........] - ETA: 46s - loss: 1.5015 - regression_loss: 1.2939 - classification_loss: 0.2076 368/500 [=====================>........] - ETA: 45s - loss: 1.5007 - regression_loss: 1.2932 - classification_loss: 0.2076 369/500 [=====================>........] - ETA: 45s - loss: 1.5014 - regression_loss: 1.2938 - classification_loss: 0.2077 370/500 [=====================>........] - ETA: 45s - loss: 1.5017 - regression_loss: 1.2939 - classification_loss: 0.2078 371/500 [=====================>........] - ETA: 44s - loss: 1.5009 - regression_loss: 1.2932 - classification_loss: 0.2077 372/500 [=====================>........] - ETA: 44s - loss: 1.5009 - regression_loss: 1.2931 - classification_loss: 0.2078 373/500 [=====================>........] - ETA: 44s - loss: 1.5002 - regression_loss: 1.2925 - classification_loss: 0.2077 374/500 [=====================>........] - ETA: 43s - loss: 1.5021 - regression_loss: 1.2941 - classification_loss: 0.2080 375/500 [=====================>........] - ETA: 43s - loss: 1.5009 - regression_loss: 1.2929 - classification_loss: 0.2080 376/500 [=====================>........] - ETA: 43s - loss: 1.5030 - regression_loss: 1.2946 - classification_loss: 0.2083 377/500 [=====================>........] 
[... per-batch progress updates for steps 378–499 of epoch 5 elided; running loss declined from 1.5034 to 1.4830 ...]
500/500 [==============================] - 174s 348ms/step - loss: 1.4830 - regression_loss: 1.2777 - classification_loss: 0.2053
1172 instances of class plum with average precision: 0.6484
mAP: 0.6484
Epoch 00005: saving model to ./training/snapshots/resnet101_pascal_05.h5
Epoch 6/150
[... per-batch progress updates for steps 1–212 of epoch 6 elided; running loss settled near 1.38 ...]
- ETA: 1:39 - loss: 1.3762 - regression_loss: 1.1947 - classification_loss: 0.1815 213/500 [===========>..................] - ETA: 1:39 - loss: 1.3764 - regression_loss: 1.1950 - classification_loss: 0.1814 214/500 [===========>..................] - ETA: 1:38 - loss: 1.3765 - regression_loss: 1.1952 - classification_loss: 0.1812 215/500 [===========>..................] - ETA: 1:38 - loss: 1.3772 - regression_loss: 1.1959 - classification_loss: 0.1813 216/500 [===========>..................] - ETA: 1:38 - loss: 1.3790 - regression_loss: 1.1973 - classification_loss: 0.1817 217/500 [============>.................] - ETA: 1:37 - loss: 1.3776 - regression_loss: 1.1959 - classification_loss: 0.1816 218/500 [============>.................] - ETA: 1:37 - loss: 1.3802 - regression_loss: 1.1978 - classification_loss: 0.1824 219/500 [============>.................] - ETA: 1:37 - loss: 1.3775 - regression_loss: 1.1955 - classification_loss: 0.1820 220/500 [============>.................] - ETA: 1:36 - loss: 1.3753 - regression_loss: 1.1936 - classification_loss: 0.1817 221/500 [============>.................] - ETA: 1:36 - loss: 1.3742 - regression_loss: 1.1927 - classification_loss: 0.1815 222/500 [============>.................] - ETA: 1:36 - loss: 1.3724 - regression_loss: 1.1913 - classification_loss: 0.1811 223/500 [============>.................] - ETA: 1:35 - loss: 1.3728 - regression_loss: 1.1916 - classification_loss: 0.1812 224/500 [============>.................] - ETA: 1:35 - loss: 1.3724 - regression_loss: 1.1912 - classification_loss: 0.1812 225/500 [============>.................] - ETA: 1:35 - loss: 1.3727 - regression_loss: 1.1915 - classification_loss: 0.1812 226/500 [============>.................] - ETA: 1:34 - loss: 1.3711 - regression_loss: 1.1901 - classification_loss: 0.1810 227/500 [============>.................] - ETA: 1:34 - loss: 1.3691 - regression_loss: 1.1885 - classification_loss: 0.1806 228/500 [============>.................] 
- ETA: 1:33 - loss: 1.3690 - regression_loss: 1.1886 - classification_loss: 0.1804 229/500 [============>.................] - ETA: 1:33 - loss: 1.3661 - regression_loss: 1.1861 - classification_loss: 0.1800 230/500 [============>.................] - ETA: 1:33 - loss: 1.3663 - regression_loss: 1.1860 - classification_loss: 0.1803 231/500 [============>.................] - ETA: 1:32 - loss: 1.3656 - regression_loss: 1.1853 - classification_loss: 0.1802 232/500 [============>.................] - ETA: 1:32 - loss: 1.3650 - regression_loss: 1.1849 - classification_loss: 0.1801 233/500 [============>.................] - ETA: 1:32 - loss: 1.3646 - regression_loss: 1.1846 - classification_loss: 0.1800 234/500 [=============>................] - ETA: 1:31 - loss: 1.3630 - regression_loss: 1.1832 - classification_loss: 0.1798 235/500 [=============>................] - ETA: 1:31 - loss: 1.3609 - regression_loss: 1.1816 - classification_loss: 0.1794 236/500 [=============>................] - ETA: 1:30 - loss: 1.3588 - regression_loss: 1.1798 - classification_loss: 0.1790 237/500 [=============>................] - ETA: 1:30 - loss: 1.3589 - regression_loss: 1.1799 - classification_loss: 0.1790 238/500 [=============>................] - ETA: 1:30 - loss: 1.3612 - regression_loss: 1.1818 - classification_loss: 0.1794 239/500 [=============>................] - ETA: 1:29 - loss: 1.3615 - regression_loss: 1.1819 - classification_loss: 0.1796 240/500 [=============>................] - ETA: 1:29 - loss: 1.3626 - regression_loss: 1.1830 - classification_loss: 0.1796 241/500 [=============>................] - ETA: 1:29 - loss: 1.3624 - regression_loss: 1.1828 - classification_loss: 0.1796 242/500 [=============>................] - ETA: 1:28 - loss: 1.3631 - regression_loss: 1.1833 - classification_loss: 0.1798 243/500 [=============>................] - ETA: 1:28 - loss: 1.3643 - regression_loss: 1.1844 - classification_loss: 0.1799 244/500 [=============>................] 
- ETA: 1:27 - loss: 1.3642 - regression_loss: 1.1843 - classification_loss: 0.1799 245/500 [=============>................] - ETA: 1:27 - loss: 1.3622 - regression_loss: 1.1825 - classification_loss: 0.1798 246/500 [=============>................] - ETA: 1:27 - loss: 1.3605 - regression_loss: 1.1810 - classification_loss: 0.1795 247/500 [=============>................] - ETA: 1:26 - loss: 1.3610 - regression_loss: 1.1813 - classification_loss: 0.1797 248/500 [=============>................] - ETA: 1:26 - loss: 1.3616 - regression_loss: 1.1819 - classification_loss: 0.1797 249/500 [=============>................] - ETA: 1:26 - loss: 1.3614 - regression_loss: 1.1816 - classification_loss: 0.1799 250/500 [==============>...............] - ETA: 1:25 - loss: 1.3605 - regression_loss: 1.1809 - classification_loss: 0.1796 251/500 [==============>...............] - ETA: 1:25 - loss: 1.3610 - regression_loss: 1.1813 - classification_loss: 0.1798 252/500 [==============>...............] - ETA: 1:25 - loss: 1.3608 - regression_loss: 1.1810 - classification_loss: 0.1797 253/500 [==============>...............] - ETA: 1:24 - loss: 1.3585 - regression_loss: 1.1792 - classification_loss: 0.1794 254/500 [==============>...............] - ETA: 1:24 - loss: 1.3583 - regression_loss: 1.1788 - classification_loss: 0.1794 255/500 [==============>...............] - ETA: 1:24 - loss: 1.3573 - regression_loss: 1.1781 - classification_loss: 0.1793 256/500 [==============>...............] - ETA: 1:23 - loss: 1.3570 - regression_loss: 1.1778 - classification_loss: 0.1792 257/500 [==============>...............] - ETA: 1:23 - loss: 1.3579 - regression_loss: 1.1783 - classification_loss: 0.1796 258/500 [==============>...............] - ETA: 1:23 - loss: 1.3582 - regression_loss: 1.1782 - classification_loss: 0.1800 259/500 [==============>...............] - ETA: 1:22 - loss: 1.3599 - regression_loss: 1.1793 - classification_loss: 0.1806 260/500 [==============>...............] 
- ETA: 1:22 - loss: 1.3599 - regression_loss: 1.1792 - classification_loss: 0.1806 261/500 [==============>...............] - ETA: 1:21 - loss: 1.3605 - regression_loss: 1.1798 - classification_loss: 0.1807 262/500 [==============>...............] - ETA: 1:21 - loss: 1.3618 - regression_loss: 1.1810 - classification_loss: 0.1808 263/500 [==============>...............] - ETA: 1:21 - loss: 1.3626 - regression_loss: 1.1818 - classification_loss: 0.1809 264/500 [==============>...............] - ETA: 1:20 - loss: 1.3646 - regression_loss: 1.1835 - classification_loss: 0.1811 265/500 [==============>...............] - ETA: 1:20 - loss: 1.3652 - regression_loss: 1.1841 - classification_loss: 0.1812 266/500 [==============>...............] - ETA: 1:20 - loss: 1.3638 - regression_loss: 1.1827 - classification_loss: 0.1811 267/500 [===============>..............] - ETA: 1:19 - loss: 1.3649 - regression_loss: 1.1833 - classification_loss: 0.1816 268/500 [===============>..............] - ETA: 1:19 - loss: 1.3616 - regression_loss: 1.1803 - classification_loss: 0.1813 269/500 [===============>..............] - ETA: 1:19 - loss: 1.3628 - regression_loss: 1.1811 - classification_loss: 0.1817 270/500 [===============>..............] - ETA: 1:18 - loss: 1.3628 - regression_loss: 1.1811 - classification_loss: 0.1817 271/500 [===============>..............] - ETA: 1:18 - loss: 1.3615 - regression_loss: 1.1798 - classification_loss: 0.1817 272/500 [===============>..............] - ETA: 1:18 - loss: 1.3602 - regression_loss: 1.1787 - classification_loss: 0.1816 273/500 [===============>..............] - ETA: 1:17 - loss: 1.3620 - regression_loss: 1.1802 - classification_loss: 0.1817 274/500 [===============>..............] - ETA: 1:17 - loss: 1.3638 - regression_loss: 1.1818 - classification_loss: 0.1820 275/500 [===============>..............] - ETA: 1:17 - loss: 1.3645 - regression_loss: 1.1823 - classification_loss: 0.1821 276/500 [===============>..............] 
- ETA: 1:16 - loss: 1.3629 - regression_loss: 1.1807 - classification_loss: 0.1821 277/500 [===============>..............] - ETA: 1:16 - loss: 1.3610 - regression_loss: 1.1790 - classification_loss: 0.1820 278/500 [===============>..............] - ETA: 1:16 - loss: 1.3613 - regression_loss: 1.1795 - classification_loss: 0.1819 279/500 [===============>..............] - ETA: 1:15 - loss: 1.3628 - regression_loss: 1.1808 - classification_loss: 0.1820 280/500 [===============>..............] - ETA: 1:15 - loss: 1.3632 - regression_loss: 1.1812 - classification_loss: 0.1821 281/500 [===============>..............] - ETA: 1:15 - loss: 1.3614 - regression_loss: 1.1797 - classification_loss: 0.1818 282/500 [===============>..............] - ETA: 1:14 - loss: 1.3617 - regression_loss: 1.1799 - classification_loss: 0.1818 283/500 [===============>..............] - ETA: 1:14 - loss: 1.3623 - regression_loss: 1.1804 - classification_loss: 0.1819 284/500 [================>.............] - ETA: 1:14 - loss: 1.3628 - regression_loss: 1.1808 - classification_loss: 0.1819 285/500 [================>.............] - ETA: 1:13 - loss: 1.3629 - regression_loss: 1.1809 - classification_loss: 0.1820 286/500 [================>.............] - ETA: 1:13 - loss: 1.3628 - regression_loss: 1.1809 - classification_loss: 0.1819 287/500 [================>.............] - ETA: 1:13 - loss: 1.3609 - regression_loss: 1.1788 - classification_loss: 0.1821 288/500 [================>.............] - ETA: 1:12 - loss: 1.3606 - regression_loss: 1.1786 - classification_loss: 0.1820 289/500 [================>.............] - ETA: 1:12 - loss: 1.3611 - regression_loss: 1.1790 - classification_loss: 0.1822 290/500 [================>.............] - ETA: 1:12 - loss: 1.3639 - regression_loss: 1.1810 - classification_loss: 0.1829 291/500 [================>.............] - ETA: 1:11 - loss: 1.3632 - regression_loss: 1.1806 - classification_loss: 0.1827 292/500 [================>.............] 
- ETA: 1:11 - loss: 1.3625 - regression_loss: 1.1799 - classification_loss: 0.1826 293/500 [================>.............] - ETA: 1:11 - loss: 1.3631 - regression_loss: 1.1805 - classification_loss: 0.1826 294/500 [================>.............] - ETA: 1:10 - loss: 1.3601 - regression_loss: 1.1779 - classification_loss: 0.1822 295/500 [================>.............] - ETA: 1:10 - loss: 1.3588 - regression_loss: 1.1767 - classification_loss: 0.1820 296/500 [================>.............] - ETA: 1:10 - loss: 1.3568 - regression_loss: 1.1751 - classification_loss: 0.1817 297/500 [================>.............] - ETA: 1:09 - loss: 1.3556 - regression_loss: 1.1740 - classification_loss: 0.1816 298/500 [================>.............] - ETA: 1:09 - loss: 1.3548 - regression_loss: 1.1732 - classification_loss: 0.1816 299/500 [================>.............] - ETA: 1:08 - loss: 1.3558 - regression_loss: 1.1741 - classification_loss: 0.1818 300/500 [=================>............] - ETA: 1:08 - loss: 1.3538 - regression_loss: 1.1723 - classification_loss: 0.1815 301/500 [=================>............] - ETA: 1:08 - loss: 1.3521 - regression_loss: 1.1709 - classification_loss: 0.1812 302/500 [=================>............] - ETA: 1:07 - loss: 1.3526 - regression_loss: 1.1715 - classification_loss: 0.1811 303/500 [=================>............] - ETA: 1:07 - loss: 1.3535 - regression_loss: 1.1723 - classification_loss: 0.1812 304/500 [=================>............] - ETA: 1:07 - loss: 1.3542 - regression_loss: 1.1729 - classification_loss: 0.1814 305/500 [=================>............] - ETA: 1:06 - loss: 1.3540 - regression_loss: 1.1728 - classification_loss: 0.1813 306/500 [=================>............] - ETA: 1:06 - loss: 1.3538 - regression_loss: 1.1727 - classification_loss: 0.1812 307/500 [=================>............] - ETA: 1:06 - loss: 1.3528 - regression_loss: 1.1717 - classification_loss: 0.1810 308/500 [=================>............] 
- ETA: 1:05 - loss: 1.3523 - regression_loss: 1.1714 - classification_loss: 0.1809 309/500 [=================>............] - ETA: 1:05 - loss: 1.3539 - regression_loss: 1.1727 - classification_loss: 0.1812 310/500 [=================>............] - ETA: 1:05 - loss: 1.3529 - regression_loss: 1.1719 - classification_loss: 0.1810 311/500 [=================>............] - ETA: 1:04 - loss: 1.3518 - regression_loss: 1.1710 - classification_loss: 0.1808 312/500 [=================>............] - ETA: 1:04 - loss: 1.3517 - regression_loss: 1.1710 - classification_loss: 0.1807 313/500 [=================>............] - ETA: 1:04 - loss: 1.3519 - regression_loss: 1.1713 - classification_loss: 0.1806 314/500 [=================>............] - ETA: 1:03 - loss: 1.3503 - regression_loss: 1.1698 - classification_loss: 0.1805 315/500 [=================>............] - ETA: 1:03 - loss: 1.3510 - regression_loss: 1.1705 - classification_loss: 0.1805 316/500 [=================>............] - ETA: 1:03 - loss: 1.3526 - regression_loss: 1.1719 - classification_loss: 0.1808 317/500 [==================>...........] - ETA: 1:02 - loss: 1.3526 - regression_loss: 1.1718 - classification_loss: 0.1809 318/500 [==================>...........] - ETA: 1:02 - loss: 1.3504 - regression_loss: 1.1698 - classification_loss: 0.1806 319/500 [==================>...........] - ETA: 1:02 - loss: 1.3486 - regression_loss: 1.1683 - classification_loss: 0.1803 320/500 [==================>...........] - ETA: 1:01 - loss: 1.3477 - regression_loss: 1.1675 - classification_loss: 0.1802 321/500 [==================>...........] - ETA: 1:01 - loss: 1.3481 - regression_loss: 1.1679 - classification_loss: 0.1802 322/500 [==================>...........] - ETA: 1:01 - loss: 1.3490 - regression_loss: 1.1686 - classification_loss: 0.1804 323/500 [==================>...........] - ETA: 1:00 - loss: 1.3499 - regression_loss: 1.1692 - classification_loss: 0.1807 324/500 [==================>...........] 
- ETA: 1:00 - loss: 1.3493 - regression_loss: 1.1687 - classification_loss: 0.1806 325/500 [==================>...........] - ETA: 1:00 - loss: 1.3490 - regression_loss: 1.1684 - classification_loss: 0.1806 326/500 [==================>...........] - ETA: 59s - loss: 1.3500 - regression_loss: 1.1694 - classification_loss: 0.1806  327/500 [==================>...........] - ETA: 59s - loss: 1.3479 - regression_loss: 1.1676 - classification_loss: 0.1803 328/500 [==================>...........] - ETA: 58s - loss: 1.3486 - regression_loss: 1.1683 - classification_loss: 0.1803 329/500 [==================>...........] - ETA: 58s - loss: 1.3494 - regression_loss: 1.1690 - classification_loss: 0.1804 330/500 [==================>...........] - ETA: 58s - loss: 1.3487 - regression_loss: 1.1684 - classification_loss: 0.1803 331/500 [==================>...........] - ETA: 57s - loss: 1.3486 - regression_loss: 1.1685 - classification_loss: 0.1802 332/500 [==================>...........] - ETA: 57s - loss: 1.3499 - regression_loss: 1.1695 - classification_loss: 0.1804 333/500 [==================>...........] - ETA: 57s - loss: 1.3516 - regression_loss: 1.1710 - classification_loss: 0.1806 334/500 [===================>..........] - ETA: 56s - loss: 1.3515 - regression_loss: 1.1709 - classification_loss: 0.1805 335/500 [===================>..........] - ETA: 56s - loss: 1.3525 - regression_loss: 1.1719 - classification_loss: 0.1806 336/500 [===================>..........] - ETA: 56s - loss: 1.3530 - regression_loss: 1.1723 - classification_loss: 0.1807 337/500 [===================>..........] - ETA: 55s - loss: 1.3540 - regression_loss: 1.1734 - classification_loss: 0.1806 338/500 [===================>..........] - ETA: 55s - loss: 1.3542 - regression_loss: 1.1737 - classification_loss: 0.1805 339/500 [===================>..........] - ETA: 55s - loss: 1.3539 - regression_loss: 1.1735 - classification_loss: 0.1804 340/500 [===================>..........] 
- ETA: 54s - loss: 1.3561 - regression_loss: 1.1753 - classification_loss: 0.1807 341/500 [===================>..........] - ETA: 54s - loss: 1.3570 - regression_loss: 1.1749 - classification_loss: 0.1821 342/500 [===================>..........] - ETA: 54s - loss: 1.3552 - regression_loss: 1.1733 - classification_loss: 0.1819 343/500 [===================>..........] - ETA: 53s - loss: 1.3556 - regression_loss: 1.1736 - classification_loss: 0.1820 344/500 [===================>..........] - ETA: 53s - loss: 1.3573 - regression_loss: 1.1752 - classification_loss: 0.1821 345/500 [===================>..........] - ETA: 53s - loss: 1.3579 - regression_loss: 1.1756 - classification_loss: 0.1823 346/500 [===================>..........] - ETA: 52s - loss: 1.3587 - regression_loss: 1.1764 - classification_loss: 0.1823 347/500 [===================>..........] - ETA: 52s - loss: 1.3571 - regression_loss: 1.1750 - classification_loss: 0.1821 348/500 [===================>..........] - ETA: 52s - loss: 1.3566 - regression_loss: 1.1746 - classification_loss: 0.1820 349/500 [===================>..........] - ETA: 51s - loss: 1.3559 - regression_loss: 1.1739 - classification_loss: 0.1820 350/500 [====================>.........] - ETA: 51s - loss: 1.3564 - regression_loss: 1.1742 - classification_loss: 0.1822 351/500 [====================>.........] - ETA: 51s - loss: 1.3564 - regression_loss: 1.1741 - classification_loss: 0.1823 352/500 [====================>.........] - ETA: 50s - loss: 1.3559 - regression_loss: 1.1739 - classification_loss: 0.1821 353/500 [====================>.........] - ETA: 50s - loss: 1.3542 - regression_loss: 1.1724 - classification_loss: 0.1818 354/500 [====================>.........] - ETA: 50s - loss: 1.3534 - regression_loss: 1.1717 - classification_loss: 0.1817 355/500 [====================>.........] - ETA: 49s - loss: 1.3528 - regression_loss: 1.1711 - classification_loss: 0.1817 356/500 [====================>.........] 
- ETA: 49s - loss: 1.3538 - regression_loss: 1.1719 - classification_loss: 0.1819 357/500 [====================>.........] - ETA: 48s - loss: 1.3536 - regression_loss: 1.1717 - classification_loss: 0.1819 358/500 [====================>.........] - ETA: 48s - loss: 1.3545 - regression_loss: 1.1725 - classification_loss: 0.1820 359/500 [====================>.........] - ETA: 48s - loss: 1.3530 - regression_loss: 1.1713 - classification_loss: 0.1817 360/500 [====================>.........] - ETA: 47s - loss: 1.3520 - regression_loss: 1.1705 - classification_loss: 0.1815 361/500 [====================>.........] - ETA: 47s - loss: 1.3527 - regression_loss: 1.1711 - classification_loss: 0.1816 362/500 [====================>.........] - ETA: 47s - loss: 1.3513 - regression_loss: 1.1700 - classification_loss: 0.1813 363/500 [====================>.........] - ETA: 46s - loss: 1.3516 - regression_loss: 1.1702 - classification_loss: 0.1814 364/500 [====================>.........] - ETA: 46s - loss: 1.3519 - regression_loss: 1.1705 - classification_loss: 0.1814 365/500 [====================>.........] - ETA: 46s - loss: 1.3500 - regression_loss: 1.1688 - classification_loss: 0.1811 366/500 [====================>.........] - ETA: 45s - loss: 1.3488 - regression_loss: 1.1679 - classification_loss: 0.1808 367/500 [=====================>........] - ETA: 45s - loss: 1.3491 - regression_loss: 1.1684 - classification_loss: 0.1807 368/500 [=====================>........] - ETA: 45s - loss: 1.3491 - regression_loss: 1.1685 - classification_loss: 0.1805 369/500 [=====================>........] - ETA: 44s - loss: 1.3490 - regression_loss: 1.1685 - classification_loss: 0.1805 370/500 [=====================>........] - ETA: 44s - loss: 1.3492 - regression_loss: 1.1688 - classification_loss: 0.1804 371/500 [=====================>........] - ETA: 44s - loss: 1.3490 - regression_loss: 1.1686 - classification_loss: 0.1804 372/500 [=====================>........] 
- ETA: 43s - loss: 1.3493 - regression_loss: 1.1687 - classification_loss: 0.1806 373/500 [=====================>........] - ETA: 43s - loss: 1.3474 - regression_loss: 1.1671 - classification_loss: 0.1803 374/500 [=====================>........] - ETA: 43s - loss: 1.3469 - regression_loss: 1.1667 - classification_loss: 0.1802 375/500 [=====================>........] - ETA: 42s - loss: 1.3471 - regression_loss: 1.1670 - classification_loss: 0.1802 376/500 [=====================>........] - ETA: 42s - loss: 1.3476 - regression_loss: 1.1673 - classification_loss: 0.1802 377/500 [=====================>........] - ETA: 42s - loss: 1.3462 - regression_loss: 1.1662 - classification_loss: 0.1801 378/500 [=====================>........] - ETA: 41s - loss: 1.3463 - regression_loss: 1.1663 - classification_loss: 0.1800 379/500 [=====================>........] - ETA: 41s - loss: 1.3452 - regression_loss: 1.1654 - classification_loss: 0.1798 380/500 [=====================>........] - ETA: 41s - loss: 1.3460 - regression_loss: 1.1661 - classification_loss: 0.1799 381/500 [=====================>........] - ETA: 40s - loss: 1.3455 - regression_loss: 1.1657 - classification_loss: 0.1798 382/500 [=====================>........] - ETA: 40s - loss: 1.3460 - regression_loss: 1.1663 - classification_loss: 0.1797 383/500 [=====================>........] - ETA: 40s - loss: 1.3472 - regression_loss: 1.1673 - classification_loss: 0.1799 384/500 [======================>.......] - ETA: 39s - loss: 1.3490 - regression_loss: 1.1686 - classification_loss: 0.1804 385/500 [======================>.......] - ETA: 39s - loss: 1.3498 - regression_loss: 1.1690 - classification_loss: 0.1808 386/500 [======================>.......] - ETA: 39s - loss: 1.3477 - regression_loss: 1.1672 - classification_loss: 0.1805 387/500 [======================>.......] - ETA: 38s - loss: 1.3483 - regression_loss: 1.1679 - classification_loss: 0.1805 388/500 [======================>.......] 
- ETA: 38s - loss: 1.3467 - regression_loss: 1.1664 - classification_loss: 0.1802 389/500 [======================>.......] - ETA: 38s - loss: 1.3482 - regression_loss: 1.1678 - classification_loss: 0.1804 390/500 [======================>.......] - ETA: 37s - loss: 1.3487 - regression_loss: 1.1682 - classification_loss: 0.1805 391/500 [======================>.......] - ETA: 37s - loss: 1.3485 - regression_loss: 1.1681 - classification_loss: 0.1803 392/500 [======================>.......] - ETA: 37s - loss: 1.3489 - regression_loss: 1.1686 - classification_loss: 0.1804 393/500 [======================>.......] - ETA: 36s - loss: 1.3501 - regression_loss: 1.1698 - classification_loss: 0.1803 394/500 [======================>.......] - ETA: 36s - loss: 1.3488 - regression_loss: 1.1687 - classification_loss: 0.1801 395/500 [======================>.......] - ETA: 36s - loss: 1.3485 - regression_loss: 1.1686 - classification_loss: 0.1800 396/500 [======================>.......] - ETA: 35s - loss: 1.3479 - regression_loss: 1.1680 - classification_loss: 0.1798 397/500 [======================>.......] - ETA: 35s - loss: 1.3482 - regression_loss: 1.1683 - classification_loss: 0.1799 398/500 [======================>.......] - ETA: 34s - loss: 1.3487 - regression_loss: 1.1688 - classification_loss: 0.1799 399/500 [======================>.......] - ETA: 34s - loss: 1.3470 - regression_loss: 1.1673 - classification_loss: 0.1797 400/500 [=======================>......] - ETA: 34s - loss: 1.3466 - regression_loss: 1.1670 - classification_loss: 0.1797 401/500 [=======================>......] - ETA: 33s - loss: 1.3471 - regression_loss: 1.1673 - classification_loss: 0.1798 402/500 [=======================>......] - ETA: 33s - loss: 1.3485 - regression_loss: 1.1684 - classification_loss: 0.1801 403/500 [=======================>......] - ETA: 33s - loss: 1.3494 - regression_loss: 1.1692 - classification_loss: 0.1802 404/500 [=======================>......] 
- ETA: 32s - loss: 1.3489 - regression_loss: 1.1688 - classification_loss: 0.1801 405/500 [=======================>......] - ETA: 32s - loss: 1.3492 - regression_loss: 1.1691 - classification_loss: 0.1801 406/500 [=======================>......] - ETA: 32s - loss: 1.3499 - regression_loss: 1.1697 - classification_loss: 0.1802 407/500 [=======================>......] - ETA: 31s - loss: 1.3499 - regression_loss: 1.1697 - classification_loss: 0.1802 408/500 [=======================>......] - ETA: 31s - loss: 1.3490 - regression_loss: 1.1690 - classification_loss: 0.1800 409/500 [=======================>......] - ETA: 31s - loss: 1.3486 - regression_loss: 1.1687 - classification_loss: 0.1800 410/500 [=======================>......] - ETA: 30s - loss: 1.3503 - regression_loss: 1.1701 - classification_loss: 0.1802 411/500 [=======================>......] - ETA: 30s - loss: 1.3498 - regression_loss: 1.1698 - classification_loss: 0.1800 412/500 [=======================>......] - ETA: 30s - loss: 1.3500 - regression_loss: 1.1701 - classification_loss: 0.1800 413/500 [=======================>......] - ETA: 29s - loss: 1.3502 - regression_loss: 1.1702 - classification_loss: 0.1800 414/500 [=======================>......] - ETA: 29s - loss: 1.3488 - regression_loss: 1.1691 - classification_loss: 0.1798 415/500 [=======================>......] - ETA: 29s - loss: 1.3485 - regression_loss: 1.1688 - classification_loss: 0.1797 416/500 [=======================>......] - ETA: 28s - loss: 1.3481 - regression_loss: 1.1684 - classification_loss: 0.1797 417/500 [========================>.....] - ETA: 28s - loss: 1.3466 - regression_loss: 1.1672 - classification_loss: 0.1794 418/500 [========================>.....] - ETA: 28s - loss: 1.3456 - regression_loss: 1.1663 - classification_loss: 0.1792 419/500 [========================>.....] - ETA: 27s - loss: 1.3444 - regression_loss: 1.1653 - classification_loss: 0.1791 420/500 [========================>.....] 
500/500 [==============================] - 172s 344ms/step - loss: 1.3267 - regression_loss: 1.1514 - classification_loss: 0.1754
1172 instances of class plum with average precision: 0.6631
mAP: 0.6631
Epoch 00006: saving model to ./training/snapshots/resnet101_pascal_06.h5
Epoch 7/150
- ETA: 1:24 - loss: 1.2191 - regression_loss: 1.0615 - classification_loss: 0.1576 255/500 [==============>...............] - ETA: 1:24 - loss: 1.2214 - regression_loss: 1.0635 - classification_loss: 0.1579 256/500 [==============>...............] - ETA: 1:23 - loss: 1.2227 - regression_loss: 1.0647 - classification_loss: 0.1580 257/500 [==============>...............] - ETA: 1:23 - loss: 1.2235 - regression_loss: 1.0653 - classification_loss: 0.1582 258/500 [==============>...............] - ETA: 1:23 - loss: 1.2248 - regression_loss: 1.0665 - classification_loss: 0.1583 259/500 [==============>...............] - ETA: 1:22 - loss: 1.2263 - regression_loss: 1.0679 - classification_loss: 0.1583 260/500 [==============>...............] - ETA: 1:22 - loss: 1.2241 - regression_loss: 1.0661 - classification_loss: 0.1580 261/500 [==============>...............] - ETA: 1:22 - loss: 1.2231 - regression_loss: 1.0652 - classification_loss: 0.1579 262/500 [==============>...............] - ETA: 1:21 - loss: 1.2235 - regression_loss: 1.0656 - classification_loss: 0.1578 263/500 [==============>...............] - ETA: 1:21 - loss: 1.2227 - regression_loss: 1.0652 - classification_loss: 0.1575 264/500 [==============>...............] - ETA: 1:21 - loss: 1.2232 - regression_loss: 1.0659 - classification_loss: 0.1574 265/500 [==============>...............] - ETA: 1:20 - loss: 1.2230 - regression_loss: 1.0657 - classification_loss: 0.1574 266/500 [==============>...............] - ETA: 1:20 - loss: 1.2238 - regression_loss: 1.0662 - classification_loss: 0.1576 267/500 [===============>..............] - ETA: 1:19 - loss: 1.2232 - regression_loss: 1.0651 - classification_loss: 0.1581 268/500 [===============>..............] - ETA: 1:19 - loss: 1.2248 - regression_loss: 1.0664 - classification_loss: 0.1584 269/500 [===============>..............] - ETA: 1:19 - loss: 1.2243 - regression_loss: 1.0659 - classification_loss: 0.1583 270/500 [===============>..............] 
- ETA: 1:18 - loss: 1.2218 - regression_loss: 1.0637 - classification_loss: 0.1581 271/500 [===============>..............] - ETA: 1:18 - loss: 1.2216 - regression_loss: 1.0638 - classification_loss: 0.1579 272/500 [===============>..............] - ETA: 1:18 - loss: 1.2207 - regression_loss: 1.0630 - classification_loss: 0.1577 273/500 [===============>..............] - ETA: 1:17 - loss: 1.2189 - regression_loss: 1.0613 - classification_loss: 0.1575 274/500 [===============>..............] - ETA: 1:17 - loss: 1.2170 - regression_loss: 1.0598 - classification_loss: 0.1572 275/500 [===============>..............] - ETA: 1:17 - loss: 1.2157 - regression_loss: 1.0587 - classification_loss: 0.1569 276/500 [===============>..............] - ETA: 1:16 - loss: 1.2144 - regression_loss: 1.0576 - classification_loss: 0.1568 277/500 [===============>..............] - ETA: 1:16 - loss: 1.2150 - regression_loss: 1.0581 - classification_loss: 0.1569 278/500 [===============>..............] - ETA: 1:16 - loss: 1.2149 - regression_loss: 1.0578 - classification_loss: 0.1571 279/500 [===============>..............] - ETA: 1:15 - loss: 1.2144 - regression_loss: 1.0573 - classification_loss: 0.1571 280/500 [===============>..............] - ETA: 1:15 - loss: 1.2127 - regression_loss: 1.0559 - classification_loss: 0.1568 281/500 [===============>..............] - ETA: 1:15 - loss: 1.2117 - regression_loss: 1.0551 - classification_loss: 0.1567 282/500 [===============>..............] - ETA: 1:14 - loss: 1.2114 - regression_loss: 1.0548 - classification_loss: 0.1566 283/500 [===============>..............] - ETA: 1:14 - loss: 1.2104 - regression_loss: 1.0539 - classification_loss: 0.1564 284/500 [================>.............] - ETA: 1:14 - loss: 1.2112 - regression_loss: 1.0548 - classification_loss: 0.1564 285/500 [================>.............] - ETA: 1:13 - loss: 1.2131 - regression_loss: 1.0565 - classification_loss: 0.1566 286/500 [================>.............] 
- ETA: 1:13 - loss: 1.2129 - regression_loss: 1.0562 - classification_loss: 0.1567 287/500 [================>.............] - ETA: 1:13 - loss: 1.2113 - regression_loss: 1.0548 - classification_loss: 0.1565 288/500 [================>.............] - ETA: 1:12 - loss: 1.2119 - regression_loss: 1.0554 - classification_loss: 0.1565 289/500 [================>.............] - ETA: 1:12 - loss: 1.2125 - regression_loss: 1.0560 - classification_loss: 0.1565 290/500 [================>.............] - ETA: 1:12 - loss: 1.2130 - regression_loss: 1.0564 - classification_loss: 0.1566 291/500 [================>.............] - ETA: 1:11 - loss: 1.2139 - regression_loss: 1.0573 - classification_loss: 0.1565 292/500 [================>.............] - ETA: 1:11 - loss: 1.2141 - regression_loss: 1.0574 - classification_loss: 0.1567 293/500 [================>.............] - ETA: 1:11 - loss: 1.2150 - regression_loss: 1.0583 - classification_loss: 0.1567 294/500 [================>.............] - ETA: 1:10 - loss: 1.2147 - regression_loss: 1.0581 - classification_loss: 0.1566 295/500 [================>.............] - ETA: 1:10 - loss: 1.2140 - regression_loss: 1.0574 - classification_loss: 0.1566 296/500 [================>.............] - ETA: 1:10 - loss: 1.2125 - regression_loss: 1.0561 - classification_loss: 0.1564 297/500 [================>.............] - ETA: 1:09 - loss: 1.2123 - regression_loss: 1.0559 - classification_loss: 0.1564 298/500 [================>.............] - ETA: 1:09 - loss: 1.2132 - regression_loss: 1.0565 - classification_loss: 0.1566 299/500 [================>.............] - ETA: 1:09 - loss: 1.2138 - regression_loss: 1.0571 - classification_loss: 0.1567 300/500 [=================>............] - ETA: 1:08 - loss: 1.2140 - regression_loss: 1.0573 - classification_loss: 0.1567 301/500 [=================>............] - ETA: 1:08 - loss: 1.2149 - regression_loss: 1.0582 - classification_loss: 0.1568 302/500 [=================>............] 
- ETA: 1:08 - loss: 1.2127 - regression_loss: 1.0562 - classification_loss: 0.1565 303/500 [=================>............] - ETA: 1:07 - loss: 1.2116 - regression_loss: 1.0551 - classification_loss: 0.1565 304/500 [=================>............] - ETA: 1:07 - loss: 1.2114 - regression_loss: 1.0549 - classification_loss: 0.1565 305/500 [=================>............] - ETA: 1:06 - loss: 1.2108 - regression_loss: 1.0544 - classification_loss: 0.1564 306/500 [=================>............] - ETA: 1:06 - loss: 1.2098 - regression_loss: 1.0535 - classification_loss: 0.1563 307/500 [=================>............] - ETA: 1:06 - loss: 1.2089 - regression_loss: 1.0528 - classification_loss: 0.1561 308/500 [=================>............] - ETA: 1:05 - loss: 1.2090 - regression_loss: 1.0529 - classification_loss: 0.1561 309/500 [=================>............] - ETA: 1:05 - loss: 1.2086 - regression_loss: 1.0526 - classification_loss: 0.1560 310/500 [=================>............] - ETA: 1:05 - loss: 1.2090 - regression_loss: 1.0529 - classification_loss: 0.1560 311/500 [=================>............] - ETA: 1:04 - loss: 1.2095 - regression_loss: 1.0534 - classification_loss: 0.1561 312/500 [=================>............] - ETA: 1:04 - loss: 1.2098 - regression_loss: 1.0536 - classification_loss: 0.1561 313/500 [=================>............] - ETA: 1:04 - loss: 1.2104 - regression_loss: 1.0543 - classification_loss: 0.1561 314/500 [=================>............] - ETA: 1:03 - loss: 1.2107 - regression_loss: 1.0546 - classification_loss: 0.1561 315/500 [=================>............] - ETA: 1:03 - loss: 1.2121 - regression_loss: 1.0560 - classification_loss: 0.1561 316/500 [=================>............] - ETA: 1:03 - loss: 1.2127 - regression_loss: 1.0565 - classification_loss: 0.1561 317/500 [==================>...........] - ETA: 1:02 - loss: 1.2120 - regression_loss: 1.0559 - classification_loss: 0.1561 318/500 [==================>...........] 
- ETA: 1:02 - loss: 1.2126 - regression_loss: 1.0564 - classification_loss: 0.1562 319/500 [==================>...........] - ETA: 1:02 - loss: 1.2113 - regression_loss: 1.0553 - classification_loss: 0.1560 320/500 [==================>...........] - ETA: 1:01 - loss: 1.2097 - regression_loss: 1.0539 - classification_loss: 0.1559 321/500 [==================>...........] - ETA: 1:01 - loss: 1.2084 - regression_loss: 1.0527 - classification_loss: 0.1557 322/500 [==================>...........] - ETA: 1:01 - loss: 1.2084 - regression_loss: 1.0530 - classification_loss: 0.1555 323/500 [==================>...........] - ETA: 1:00 - loss: 1.2083 - regression_loss: 1.0529 - classification_loss: 0.1554 324/500 [==================>...........] - ETA: 1:00 - loss: 1.2083 - regression_loss: 1.0528 - classification_loss: 0.1555 325/500 [==================>...........] - ETA: 1:00 - loss: 1.2092 - regression_loss: 1.0535 - classification_loss: 0.1557 326/500 [==================>...........] - ETA: 59s - loss: 1.2098 - regression_loss: 1.0540 - classification_loss: 0.1558  327/500 [==================>...........] - ETA: 59s - loss: 1.2104 - regression_loss: 1.0544 - classification_loss: 0.1560 328/500 [==================>...........] - ETA: 58s - loss: 1.2083 - regression_loss: 1.0525 - classification_loss: 0.1558 329/500 [==================>...........] - ETA: 58s - loss: 1.2067 - regression_loss: 1.0512 - classification_loss: 0.1555 330/500 [==================>...........] - ETA: 58s - loss: 1.2069 - regression_loss: 1.0513 - classification_loss: 0.1556 331/500 [==================>...........] - ETA: 57s - loss: 1.2044 - regression_loss: 1.0490 - classification_loss: 0.1554 332/500 [==================>...........] - ETA: 57s - loss: 1.2040 - regression_loss: 1.0487 - classification_loss: 0.1553 333/500 [==================>...........] - ETA: 57s - loss: 1.2028 - regression_loss: 1.0478 - classification_loss: 0.1551 334/500 [===================>..........] 
- ETA: 56s - loss: 1.2019 - regression_loss: 1.0468 - classification_loss: 0.1551 335/500 [===================>..........] - ETA: 56s - loss: 1.2000 - regression_loss: 1.0452 - classification_loss: 0.1548 336/500 [===================>..........] - ETA: 56s - loss: 1.2011 - regression_loss: 1.0460 - classification_loss: 0.1551 337/500 [===================>..........] - ETA: 55s - loss: 1.2008 - regression_loss: 1.0458 - classification_loss: 0.1550 338/500 [===================>..........] - ETA: 55s - loss: 1.2004 - regression_loss: 1.0455 - classification_loss: 0.1549 339/500 [===================>..........] - ETA: 55s - loss: 1.2004 - regression_loss: 1.0455 - classification_loss: 0.1548 340/500 [===================>..........] - ETA: 54s - loss: 1.2009 - regression_loss: 1.0460 - classification_loss: 0.1549 341/500 [===================>..........] - ETA: 54s - loss: 1.1995 - regression_loss: 1.0448 - classification_loss: 0.1547 342/500 [===================>..........] - ETA: 54s - loss: 1.1996 - regression_loss: 1.0445 - classification_loss: 0.1550 343/500 [===================>..........] - ETA: 53s - loss: 1.1998 - regression_loss: 1.0448 - classification_loss: 0.1550 344/500 [===================>..........] - ETA: 53s - loss: 1.1989 - regression_loss: 1.0441 - classification_loss: 0.1548 345/500 [===================>..........] - ETA: 53s - loss: 1.1975 - regression_loss: 1.0428 - classification_loss: 0.1546 346/500 [===================>..........] - ETA: 52s - loss: 1.1975 - regression_loss: 1.0429 - classification_loss: 0.1545 347/500 [===================>..........] - ETA: 52s - loss: 1.1985 - regression_loss: 1.0438 - classification_loss: 0.1547 348/500 [===================>..........] - ETA: 52s - loss: 1.1977 - regression_loss: 1.0432 - classification_loss: 0.1546 349/500 [===================>..........] - ETA: 51s - loss: 1.1967 - regression_loss: 1.0424 - classification_loss: 0.1543 350/500 [====================>.........] 
- ETA: 51s - loss: 1.1962 - regression_loss: 1.0419 - classification_loss: 0.1543 351/500 [====================>.........] - ETA: 51s - loss: 1.1971 - regression_loss: 1.0427 - classification_loss: 0.1544 352/500 [====================>.........] - ETA: 50s - loss: 1.1982 - regression_loss: 1.0438 - classification_loss: 0.1544 353/500 [====================>.........] - ETA: 50s - loss: 1.1983 - regression_loss: 1.0440 - classification_loss: 0.1543 354/500 [====================>.........] - ETA: 50s - loss: 1.1996 - regression_loss: 1.0451 - classification_loss: 0.1545 355/500 [====================>.........] - ETA: 49s - loss: 1.1994 - regression_loss: 1.0449 - classification_loss: 0.1545 356/500 [====================>.........] - ETA: 49s - loss: 1.1993 - regression_loss: 1.0448 - classification_loss: 0.1545 357/500 [====================>.........] - ETA: 49s - loss: 1.1985 - regression_loss: 1.0440 - classification_loss: 0.1545 358/500 [====================>.........] - ETA: 48s - loss: 1.1991 - regression_loss: 1.0446 - classification_loss: 0.1545 359/500 [====================>.........] - ETA: 48s - loss: 1.2001 - regression_loss: 1.0454 - classification_loss: 0.1547 360/500 [====================>.........] - ETA: 48s - loss: 1.2014 - regression_loss: 1.0465 - classification_loss: 0.1549 361/500 [====================>.........] - ETA: 47s - loss: 1.2015 - regression_loss: 1.0466 - classification_loss: 0.1549 362/500 [====================>.........] - ETA: 47s - loss: 1.2002 - regression_loss: 1.0455 - classification_loss: 0.1547 363/500 [====================>.........] - ETA: 47s - loss: 1.2007 - regression_loss: 1.0460 - classification_loss: 0.1547 364/500 [====================>.........] - ETA: 46s - loss: 1.2016 - regression_loss: 1.0470 - classification_loss: 0.1546 365/500 [====================>.........] - ETA: 46s - loss: 1.2025 - regression_loss: 1.0478 - classification_loss: 0.1547 366/500 [====================>.........] 
- ETA: 45s - loss: 1.2025 - regression_loss: 1.0479 - classification_loss: 0.1546 367/500 [=====================>........] - ETA: 45s - loss: 1.2027 - regression_loss: 1.0482 - classification_loss: 0.1545 368/500 [=====================>........] - ETA: 45s - loss: 1.2034 - regression_loss: 1.0488 - classification_loss: 0.1546 369/500 [=====================>........] - ETA: 44s - loss: 1.2034 - regression_loss: 1.0488 - classification_loss: 0.1546 370/500 [=====================>........] - ETA: 44s - loss: 1.2039 - regression_loss: 1.0491 - classification_loss: 0.1547 371/500 [=====================>........] - ETA: 44s - loss: 1.2046 - regression_loss: 1.0496 - classification_loss: 0.1549 372/500 [=====================>........] - ETA: 43s - loss: 1.2045 - regression_loss: 1.0496 - classification_loss: 0.1549 373/500 [=====================>........] - ETA: 43s - loss: 1.2030 - regression_loss: 1.0484 - classification_loss: 0.1546 374/500 [=====================>........] - ETA: 43s - loss: 1.2027 - regression_loss: 1.0477 - classification_loss: 0.1550 375/500 [=====================>........] - ETA: 42s - loss: 1.2023 - regression_loss: 1.0473 - classification_loss: 0.1550 376/500 [=====================>........] - ETA: 42s - loss: 1.2023 - regression_loss: 1.0474 - classification_loss: 0.1549 377/500 [=====================>........] - ETA: 42s - loss: 1.2007 - regression_loss: 1.0460 - classification_loss: 0.1547 378/500 [=====================>........] - ETA: 41s - loss: 1.1995 - regression_loss: 1.0449 - classification_loss: 0.1546 379/500 [=====================>........] - ETA: 41s - loss: 1.1987 - regression_loss: 1.0444 - classification_loss: 0.1543 380/500 [=====================>........] - ETA: 41s - loss: 1.1987 - regression_loss: 1.0444 - classification_loss: 0.1542 381/500 [=====================>........] - ETA: 40s - loss: 1.2011 - regression_loss: 1.0465 - classification_loss: 0.1546 382/500 [=====================>........] 
- ETA: 40s - loss: 1.2006 - regression_loss: 1.0461 - classification_loss: 0.1545 383/500 [=====================>........] - ETA: 40s - loss: 1.2010 - regression_loss: 1.0464 - classification_loss: 0.1545 384/500 [======================>.......] - ETA: 39s - loss: 1.2015 - regression_loss: 1.0468 - classification_loss: 0.1547 385/500 [======================>.......] - ETA: 39s - loss: 1.2020 - regression_loss: 1.0469 - classification_loss: 0.1550 386/500 [======================>.......] - ETA: 39s - loss: 1.2014 - regression_loss: 1.0464 - classification_loss: 0.1550 387/500 [======================>.......] - ETA: 38s - loss: 1.2004 - regression_loss: 1.0455 - classification_loss: 0.1549 388/500 [======================>.......] - ETA: 38s - loss: 1.2002 - regression_loss: 1.0454 - classification_loss: 0.1548 389/500 [======================>.......] - ETA: 38s - loss: 1.1996 - regression_loss: 1.0448 - classification_loss: 0.1548 390/500 [======================>.......] - ETA: 37s - loss: 1.1995 - regression_loss: 1.0449 - classification_loss: 0.1547 391/500 [======================>.......] - ETA: 37s - loss: 1.1998 - regression_loss: 1.0451 - classification_loss: 0.1547 392/500 [======================>.......] - ETA: 37s - loss: 1.1979 - regression_loss: 1.0434 - classification_loss: 0.1545 393/500 [======================>.......] - ETA: 36s - loss: 1.1963 - regression_loss: 1.0420 - classification_loss: 0.1543 394/500 [======================>.......] - ETA: 36s - loss: 1.1957 - regression_loss: 1.0415 - classification_loss: 0.1542 395/500 [======================>.......] - ETA: 36s - loss: 1.1957 - regression_loss: 1.0416 - classification_loss: 0.1541 396/500 [======================>.......] - ETA: 35s - loss: 1.1947 - regression_loss: 1.0408 - classification_loss: 0.1539 397/500 [======================>.......] - ETA: 35s - loss: 1.1950 - regression_loss: 1.0410 - classification_loss: 0.1540 398/500 [======================>.......] 
- ETA: 34s - loss: 1.1929 - regression_loss: 1.0392 - classification_loss: 0.1537 399/500 [======================>.......] - ETA: 34s - loss: 1.1927 - regression_loss: 1.0391 - classification_loss: 0.1536 400/500 [=======================>......] - ETA: 34s - loss: 1.1936 - regression_loss: 1.0398 - classification_loss: 0.1537 401/500 [=======================>......] - ETA: 33s - loss: 1.1938 - regression_loss: 1.0401 - classification_loss: 0.1537 402/500 [=======================>......] - ETA: 33s - loss: 1.1943 - regression_loss: 1.0407 - classification_loss: 0.1537 403/500 [=======================>......] - ETA: 33s - loss: 1.1945 - regression_loss: 1.0407 - classification_loss: 0.1538 404/500 [=======================>......] - ETA: 32s - loss: 1.1953 - regression_loss: 1.0416 - classification_loss: 0.1537 405/500 [=======================>......] - ETA: 32s - loss: 1.1959 - regression_loss: 1.0421 - classification_loss: 0.1538 406/500 [=======================>......] - ETA: 32s - loss: 1.1966 - regression_loss: 1.0427 - classification_loss: 0.1539 407/500 [=======================>......] - ETA: 31s - loss: 1.1950 - regression_loss: 1.0412 - classification_loss: 0.1537 408/500 [=======================>......] - ETA: 31s - loss: 1.1956 - regression_loss: 1.0418 - classification_loss: 0.1538 409/500 [=======================>......] - ETA: 31s - loss: 1.1960 - regression_loss: 1.0422 - classification_loss: 0.1538 410/500 [=======================>......] - ETA: 30s - loss: 1.1960 - regression_loss: 1.0421 - classification_loss: 0.1539 411/500 [=======================>......] - ETA: 30s - loss: 1.1948 - regression_loss: 1.0408 - classification_loss: 0.1540 412/500 [=======================>......] - ETA: 30s - loss: 1.1936 - regression_loss: 1.0396 - classification_loss: 0.1539 413/500 [=======================>......] - ETA: 29s - loss: 1.1918 - regression_loss: 1.0381 - classification_loss: 0.1537 414/500 [=======================>......] 
- ETA: 29s - loss: 1.1918 - regression_loss: 1.0381 - classification_loss: 0.1537 415/500 [=======================>......] - ETA: 29s - loss: 1.1926 - regression_loss: 1.0388 - classification_loss: 0.1538 416/500 [=======================>......] - ETA: 28s - loss: 1.1924 - regression_loss: 1.0386 - classification_loss: 0.1538 417/500 [========================>.....] - ETA: 28s - loss: 1.1928 - regression_loss: 1.0389 - classification_loss: 0.1539 418/500 [========================>.....] - ETA: 28s - loss: 1.1941 - regression_loss: 1.0398 - classification_loss: 0.1542 419/500 [========================>.....] - ETA: 27s - loss: 1.1929 - regression_loss: 1.0389 - classification_loss: 0.1540 420/500 [========================>.....] - ETA: 27s - loss: 1.1925 - regression_loss: 1.0386 - classification_loss: 0.1540 421/500 [========================>.....] - ETA: 27s - loss: 1.1914 - regression_loss: 1.0376 - classification_loss: 0.1538 422/500 [========================>.....] - ETA: 26s - loss: 1.1901 - regression_loss: 1.0365 - classification_loss: 0.1536 423/500 [========================>.....] - ETA: 26s - loss: 1.1896 - regression_loss: 1.0361 - classification_loss: 0.1535 424/500 [========================>.....] - ETA: 26s - loss: 1.1886 - regression_loss: 1.0353 - classification_loss: 0.1534 425/500 [========================>.....] - ETA: 25s - loss: 1.1898 - regression_loss: 1.0363 - classification_loss: 0.1534 426/500 [========================>.....] - ETA: 25s - loss: 1.1894 - regression_loss: 1.0359 - classification_loss: 0.1535 427/500 [========================>.....] - ETA: 25s - loss: 1.1898 - regression_loss: 1.0364 - classification_loss: 0.1534 428/500 [========================>.....] - ETA: 24s - loss: 1.1907 - regression_loss: 1.0373 - classification_loss: 0.1534 429/500 [========================>.....] - ETA: 24s - loss: 1.1902 - regression_loss: 1.0368 - classification_loss: 0.1534 430/500 [========================>.....] 
- ETA: 24s - loss: 1.1907 - regression_loss: 1.0372 - classification_loss: 0.1535 431/500 [========================>.....] - ETA: 23s - loss: 1.1900 - regression_loss: 1.0367 - classification_loss: 0.1533 432/500 [========================>.....] - ETA: 23s - loss: 1.1898 - regression_loss: 1.0365 - classification_loss: 0.1533 433/500 [========================>.....] - ETA: 22s - loss: 1.1899 - regression_loss: 1.0366 - classification_loss: 0.1533 434/500 [=========================>....] - ETA: 22s - loss: 1.1902 - regression_loss: 1.0368 - classification_loss: 0.1533 435/500 [=========================>....] - ETA: 22s - loss: 1.1920 - regression_loss: 1.0386 - classification_loss: 0.1534 436/500 [=========================>....] - ETA: 21s - loss: 1.1925 - regression_loss: 1.0390 - classification_loss: 0.1535 437/500 [=========================>....] - ETA: 21s - loss: 1.1918 - regression_loss: 1.0384 - classification_loss: 0.1534 438/500 [=========================>....] - ETA: 21s - loss: 1.1922 - regression_loss: 1.0387 - classification_loss: 0.1535 439/500 [=========================>....] - ETA: 20s - loss: 1.1916 - regression_loss: 1.0382 - classification_loss: 0.1534 440/500 [=========================>....] - ETA: 20s - loss: 1.1912 - regression_loss: 1.0379 - classification_loss: 0.1533 441/500 [=========================>....] - ETA: 20s - loss: 1.1900 - regression_loss: 1.0369 - classification_loss: 0.1531 442/500 [=========================>....] - ETA: 19s - loss: 1.1889 - regression_loss: 1.0360 - classification_loss: 0.1529 443/500 [=========================>....] - ETA: 19s - loss: 1.1879 - regression_loss: 1.0352 - classification_loss: 0.1527 444/500 [=========================>....] - ETA: 19s - loss: 1.1888 - regression_loss: 1.0360 - classification_loss: 0.1528 445/500 [=========================>....] - ETA: 18s - loss: 1.1894 - regression_loss: 1.0366 - classification_loss: 0.1529 446/500 [=========================>....] 
- ETA: 18s - loss: 1.1902 - regression_loss: 1.0373 - classification_loss: 0.1529 447/500 [=========================>....] - ETA: 18s - loss: 1.1903 - regression_loss: 1.0373 - classification_loss: 0.1530 448/500 [=========================>....] - ETA: 17s - loss: 1.1907 - regression_loss: 1.0377 - classification_loss: 0.1529 449/500 [=========================>....] - ETA: 17s - loss: 1.1895 - regression_loss: 1.0368 - classification_loss: 0.1528 450/500 [==========================>...] - ETA: 17s - loss: 1.1892 - regression_loss: 1.0365 - classification_loss: 0.1527 451/500 [==========================>...] - ETA: 16s - loss: 1.1890 - regression_loss: 1.0363 - classification_loss: 0.1527 452/500 [==========================>...] - ETA: 16s - loss: 1.1877 - regression_loss: 1.0351 - classification_loss: 0.1526 453/500 [==========================>...] - ETA: 16s - loss: 1.1873 - regression_loss: 1.0344 - classification_loss: 0.1529 454/500 [==========================>...] - ETA: 15s - loss: 1.1871 - regression_loss: 1.0343 - classification_loss: 0.1528 455/500 [==========================>...] - ETA: 15s - loss: 1.1874 - regression_loss: 1.0346 - classification_loss: 0.1528 456/500 [==========================>...] - ETA: 15s - loss: 1.1880 - regression_loss: 1.0350 - classification_loss: 0.1530 457/500 [==========================>...] - ETA: 14s - loss: 1.1870 - regression_loss: 1.0341 - classification_loss: 0.1529 458/500 [==========================>...] - ETA: 14s - loss: 1.1871 - regression_loss: 1.0342 - classification_loss: 0.1529 459/500 [==========================>...] - ETA: 14s - loss: 1.1877 - regression_loss: 1.0348 - classification_loss: 0.1530 460/500 [==========================>...] - ETA: 13s - loss: 1.1887 - regression_loss: 1.0355 - classification_loss: 0.1531 461/500 [==========================>...] - ETA: 13s - loss: 1.1891 - regression_loss: 1.0360 - classification_loss: 0.1532 462/500 [==========================>...] 
[per-step progress output for epoch 7, steps 463-499, omitted; the running loss held near 1.185-1.189 throughout]
500/500 [==============================] - 172s 343ms/step - loss: 1.1844 - regression_loss: 1.0324 - classification_loss: 0.1520
1172 instances of class plum with average precision: 0.6855
mAP: 0.6855
Epoch 00007: saving model to ./training/snapshots/resnet101_pascal_07.h5
Epoch 8/150
[per-step progress output for epoch 8, steps 1-297, omitted; the running loss started near 0.51 on the first batch, climbed past 1.13 within the first ~70 steps, and settled around 1.12 (regression ~0.98, classification ~0.14) by step 296/500]
- ETA: 1:09 - loss: 1.1199 - regression_loss: 0.9785 - classification_loss: 0.1414 298/500 [================>.............] - ETA: 1:09 - loss: 1.1185 - regression_loss: 0.9774 - classification_loss: 0.1411 299/500 [================>.............] - ETA: 1:08 - loss: 1.1170 - regression_loss: 0.9761 - classification_loss: 0.1409 300/500 [=================>............] - ETA: 1:08 - loss: 1.1174 - regression_loss: 0.9765 - classification_loss: 0.1409 301/500 [=================>............] - ETA: 1:08 - loss: 1.1173 - regression_loss: 0.9765 - classification_loss: 0.1408 302/500 [=================>............] - ETA: 1:07 - loss: 1.1164 - regression_loss: 0.9757 - classification_loss: 0.1407 303/500 [=================>............] - ETA: 1:07 - loss: 1.1163 - regression_loss: 0.9756 - classification_loss: 0.1407 304/500 [=================>............] - ETA: 1:07 - loss: 1.1176 - regression_loss: 0.9768 - classification_loss: 0.1407 305/500 [=================>............] - ETA: 1:06 - loss: 1.1164 - regression_loss: 0.9759 - classification_loss: 0.1406 306/500 [=================>............] - ETA: 1:06 - loss: 1.1157 - regression_loss: 0.9752 - classification_loss: 0.1405 307/500 [=================>............] - ETA: 1:06 - loss: 1.1156 - regression_loss: 0.9750 - classification_loss: 0.1406 308/500 [=================>............] - ETA: 1:05 - loss: 1.1140 - regression_loss: 0.9736 - classification_loss: 0.1404 309/500 [=================>............] - ETA: 1:05 - loss: 1.1154 - regression_loss: 0.9747 - classification_loss: 0.1407 310/500 [=================>............] - ETA: 1:05 - loss: 1.1155 - regression_loss: 0.9748 - classification_loss: 0.1407 311/500 [=================>............] - ETA: 1:04 - loss: 1.1134 - regression_loss: 0.9730 - classification_loss: 0.1404 312/500 [=================>............] - ETA: 1:04 - loss: 1.1143 - regression_loss: 0.9739 - classification_loss: 0.1404 313/500 [=================>............] 
- ETA: 1:04 - loss: 1.1130 - regression_loss: 0.9728 - classification_loss: 0.1402 314/500 [=================>............] - ETA: 1:03 - loss: 1.1119 - regression_loss: 0.9719 - classification_loss: 0.1400 315/500 [=================>............] - ETA: 1:03 - loss: 1.1125 - regression_loss: 0.9725 - classification_loss: 0.1401 316/500 [=================>............] - ETA: 1:03 - loss: 1.1121 - regression_loss: 0.9721 - classification_loss: 0.1401 317/500 [==================>...........] - ETA: 1:02 - loss: 1.1125 - regression_loss: 0.9725 - classification_loss: 0.1400 318/500 [==================>...........] - ETA: 1:02 - loss: 1.1122 - regression_loss: 0.9722 - classification_loss: 0.1400 319/500 [==================>...........] - ETA: 1:02 - loss: 1.1134 - regression_loss: 0.9732 - classification_loss: 0.1401 320/500 [==================>...........] - ETA: 1:01 - loss: 1.1133 - regression_loss: 0.9733 - classification_loss: 0.1400 321/500 [==================>...........] - ETA: 1:01 - loss: 1.1117 - regression_loss: 0.9720 - classification_loss: 0.1397 322/500 [==================>...........] - ETA: 1:01 - loss: 1.1105 - regression_loss: 0.9709 - classification_loss: 0.1396 323/500 [==================>...........] - ETA: 1:00 - loss: 1.1108 - regression_loss: 0.9711 - classification_loss: 0.1397 324/500 [==================>...........] - ETA: 1:00 - loss: 1.1124 - regression_loss: 0.9725 - classification_loss: 0.1399 325/500 [==================>...........] - ETA: 1:00 - loss: 1.1106 - regression_loss: 0.9710 - classification_loss: 0.1397 326/500 [==================>...........] - ETA: 59s - loss: 1.1109 - regression_loss: 0.9712 - classification_loss: 0.1397  327/500 [==================>...........] - ETA: 59s - loss: 1.1129 - regression_loss: 0.9729 - classification_loss: 0.1400 328/500 [==================>...........] - ETA: 59s - loss: 1.1119 - regression_loss: 0.9722 - classification_loss: 0.1398 329/500 [==================>...........] 
- ETA: 58s - loss: 1.1114 - regression_loss: 0.9717 - classification_loss: 0.1397 330/500 [==================>...........] - ETA: 58s - loss: 1.1120 - regression_loss: 0.9723 - classification_loss: 0.1397 331/500 [==================>...........] - ETA: 57s - loss: 1.1120 - regression_loss: 0.9724 - classification_loss: 0.1396 332/500 [==================>...........] - ETA: 57s - loss: 1.1123 - regression_loss: 0.9727 - classification_loss: 0.1396 333/500 [==================>...........] - ETA: 57s - loss: 1.1112 - regression_loss: 0.9718 - classification_loss: 0.1394 334/500 [===================>..........] - ETA: 56s - loss: 1.1102 - regression_loss: 0.9707 - classification_loss: 0.1395 335/500 [===================>..........] - ETA: 56s - loss: 1.1117 - regression_loss: 0.9721 - classification_loss: 0.1396 336/500 [===================>..........] - ETA: 56s - loss: 1.1120 - regression_loss: 0.9724 - classification_loss: 0.1396 337/500 [===================>..........] - ETA: 55s - loss: 1.1123 - regression_loss: 0.9726 - classification_loss: 0.1397 338/500 [===================>..........] - ETA: 55s - loss: 1.1126 - regression_loss: 0.9731 - classification_loss: 0.1395 339/500 [===================>..........] - ETA: 55s - loss: 1.1124 - regression_loss: 0.9730 - classification_loss: 0.1395 340/500 [===================>..........] - ETA: 54s - loss: 1.1127 - regression_loss: 0.9732 - classification_loss: 0.1395 341/500 [===================>..........] - ETA: 54s - loss: 1.1116 - regression_loss: 0.9723 - classification_loss: 0.1393 342/500 [===================>..........] - ETA: 54s - loss: 1.1109 - regression_loss: 0.9718 - classification_loss: 0.1392 343/500 [===================>..........] - ETA: 53s - loss: 1.1117 - regression_loss: 0.9724 - classification_loss: 0.1394 344/500 [===================>..........] - ETA: 53s - loss: 1.1105 - regression_loss: 0.9712 - classification_loss: 0.1393 345/500 [===================>..........] 
- ETA: 53s - loss: 1.1109 - regression_loss: 0.9716 - classification_loss: 0.1393 346/500 [===================>..........] - ETA: 52s - loss: 1.1117 - regression_loss: 0.9724 - classification_loss: 0.1393 347/500 [===================>..........] - ETA: 52s - loss: 1.1108 - regression_loss: 0.9716 - classification_loss: 0.1391 348/500 [===================>..........] - ETA: 52s - loss: 1.1111 - regression_loss: 0.9718 - classification_loss: 0.1393 349/500 [===================>..........] - ETA: 51s - loss: 1.1109 - regression_loss: 0.9718 - classification_loss: 0.1392 350/500 [====================>.........] - ETA: 51s - loss: 1.1101 - regression_loss: 0.9711 - classification_loss: 0.1389 351/500 [====================>.........] - ETA: 51s - loss: 1.1109 - regression_loss: 0.9718 - classification_loss: 0.1390 352/500 [====================>.........] - ETA: 50s - loss: 1.1109 - regression_loss: 0.9718 - classification_loss: 0.1391 353/500 [====================>.........] - ETA: 50s - loss: 1.1096 - regression_loss: 0.9707 - classification_loss: 0.1389 354/500 [====================>.........] - ETA: 50s - loss: 1.1090 - regression_loss: 0.9701 - classification_loss: 0.1389 355/500 [====================>.........] - ETA: 49s - loss: 1.1094 - regression_loss: 0.9705 - classification_loss: 0.1389 356/500 [====================>.........] - ETA: 49s - loss: 1.1095 - regression_loss: 0.9705 - classification_loss: 0.1389 357/500 [====================>.........] - ETA: 49s - loss: 1.1094 - regression_loss: 0.9706 - classification_loss: 0.1388 358/500 [====================>.........] - ETA: 48s - loss: 1.1085 - regression_loss: 0.9698 - classification_loss: 0.1387 359/500 [====================>.........] - ETA: 48s - loss: 1.1092 - regression_loss: 0.9705 - classification_loss: 0.1387 360/500 [====================>.........] - ETA: 48s - loss: 1.1088 - regression_loss: 0.9702 - classification_loss: 0.1386 361/500 [====================>.........] 
- ETA: 47s - loss: 1.1087 - regression_loss: 0.9701 - classification_loss: 0.1385 362/500 [====================>.........] - ETA: 47s - loss: 1.1094 - regression_loss: 0.9707 - classification_loss: 0.1386 363/500 [====================>.........] - ETA: 47s - loss: 1.1097 - regression_loss: 0.9711 - classification_loss: 0.1386 364/500 [====================>.........] - ETA: 46s - loss: 1.1097 - regression_loss: 0.9710 - classification_loss: 0.1387 365/500 [====================>.........] - ETA: 46s - loss: 1.1096 - regression_loss: 0.9709 - classification_loss: 0.1387 366/500 [====================>.........] - ETA: 46s - loss: 1.1098 - regression_loss: 0.9712 - classification_loss: 0.1386 367/500 [=====================>........] - ETA: 45s - loss: 1.1099 - regression_loss: 0.9712 - classification_loss: 0.1387 368/500 [=====================>........] - ETA: 45s - loss: 1.1082 - regression_loss: 0.9697 - classification_loss: 0.1385 369/500 [=====================>........] - ETA: 45s - loss: 1.1075 - regression_loss: 0.9692 - classification_loss: 0.1383 370/500 [=====================>........] - ETA: 44s - loss: 1.1074 - regression_loss: 0.9691 - classification_loss: 0.1383 371/500 [=====================>........] - ETA: 44s - loss: 1.1063 - regression_loss: 0.9682 - classification_loss: 0.1382 372/500 [=====================>........] - ETA: 43s - loss: 1.1063 - regression_loss: 0.9682 - classification_loss: 0.1381 373/500 [=====================>........] - ETA: 43s - loss: 1.1066 - regression_loss: 0.9685 - classification_loss: 0.1382 374/500 [=====================>........] - ETA: 43s - loss: 1.1054 - regression_loss: 0.9674 - classification_loss: 0.1380 375/500 [=====================>........] - ETA: 42s - loss: 1.1067 - regression_loss: 0.9685 - classification_loss: 0.1382 376/500 [=====================>........] - ETA: 42s - loss: 1.1072 - regression_loss: 0.9690 - classification_loss: 0.1382 377/500 [=====================>........] 
- ETA: 42s - loss: 1.1066 - regression_loss: 0.9686 - classification_loss: 0.1380 378/500 [=====================>........] - ETA: 41s - loss: 1.1063 - regression_loss: 0.9683 - classification_loss: 0.1380 379/500 [=====================>........] - ETA: 41s - loss: 1.1055 - regression_loss: 0.9676 - classification_loss: 0.1378 380/500 [=====================>........] - ETA: 41s - loss: 1.1052 - regression_loss: 0.9674 - classification_loss: 0.1378 381/500 [=====================>........] - ETA: 40s - loss: 1.1065 - regression_loss: 0.9684 - classification_loss: 0.1381 382/500 [=====================>........] - ETA: 40s - loss: 1.1055 - regression_loss: 0.9675 - classification_loss: 0.1379 383/500 [=====================>........] - ETA: 40s - loss: 1.1051 - regression_loss: 0.9672 - classification_loss: 0.1379 384/500 [======================>.......] - ETA: 39s - loss: 1.1044 - regression_loss: 0.9666 - classification_loss: 0.1378 385/500 [======================>.......] - ETA: 39s - loss: 1.1039 - regression_loss: 0.9661 - classification_loss: 0.1377 386/500 [======================>.......] - ETA: 39s - loss: 1.1041 - regression_loss: 0.9664 - classification_loss: 0.1377 387/500 [======================>.......] - ETA: 38s - loss: 1.1043 - regression_loss: 0.9667 - classification_loss: 0.1376 388/500 [======================>.......] - ETA: 38s - loss: 1.1042 - regression_loss: 0.9667 - classification_loss: 0.1375 389/500 [======================>.......] - ETA: 38s - loss: 1.1043 - regression_loss: 0.9667 - classification_loss: 0.1375 390/500 [======================>.......] - ETA: 37s - loss: 1.1049 - regression_loss: 0.9673 - classification_loss: 0.1376 391/500 [======================>.......] - ETA: 37s - loss: 1.1038 - regression_loss: 0.9664 - classification_loss: 0.1375 392/500 [======================>.......] - ETA: 37s - loss: 1.1021 - regression_loss: 0.9648 - classification_loss: 0.1373 393/500 [======================>.......] 
- ETA: 36s - loss: 1.1014 - regression_loss: 0.9641 - classification_loss: 0.1372 394/500 [======================>.......] - ETA: 36s - loss: 1.1009 - regression_loss: 0.9637 - classification_loss: 0.1372 395/500 [======================>.......] - ETA: 36s - loss: 1.1017 - regression_loss: 0.9644 - classification_loss: 0.1373 396/500 [======================>.......] - ETA: 35s - loss: 1.1023 - regression_loss: 0.9649 - classification_loss: 0.1374 397/500 [======================>.......] - ETA: 35s - loss: 1.1017 - regression_loss: 0.9643 - classification_loss: 0.1374 398/500 [======================>.......] - ETA: 35s - loss: 1.1011 - regression_loss: 0.9638 - classification_loss: 0.1373 399/500 [======================>.......] - ETA: 34s - loss: 1.1012 - regression_loss: 0.9639 - classification_loss: 0.1373 400/500 [=======================>......] - ETA: 34s - loss: 1.0993 - regression_loss: 0.9622 - classification_loss: 0.1371 401/500 [=======================>......] - ETA: 33s - loss: 1.0982 - regression_loss: 0.9613 - classification_loss: 0.1370 402/500 [=======================>......] - ETA: 33s - loss: 1.0967 - regression_loss: 0.9600 - classification_loss: 0.1368 403/500 [=======================>......] - ETA: 33s - loss: 1.0971 - regression_loss: 0.9603 - classification_loss: 0.1368 404/500 [=======================>......] - ETA: 32s - loss: 1.0965 - regression_loss: 0.9598 - classification_loss: 0.1367 405/500 [=======================>......] - ETA: 32s - loss: 1.0951 - regression_loss: 0.9586 - classification_loss: 0.1365 406/500 [=======================>......] - ETA: 32s - loss: 1.0949 - regression_loss: 0.9584 - classification_loss: 0.1365 407/500 [=======================>......] - ETA: 31s - loss: 1.0938 - regression_loss: 0.9575 - classification_loss: 0.1363 408/500 [=======================>......] - ETA: 31s - loss: 1.0946 - regression_loss: 0.9583 - classification_loss: 0.1363 409/500 [=======================>......] 
- ETA: 31s - loss: 1.0953 - regression_loss: 0.9590 - classification_loss: 0.1363 410/500 [=======================>......] - ETA: 30s - loss: 1.0956 - regression_loss: 0.9592 - classification_loss: 0.1364 411/500 [=======================>......] - ETA: 30s - loss: 1.0957 - regression_loss: 0.9592 - classification_loss: 0.1365 412/500 [=======================>......] - ETA: 30s - loss: 1.0951 - regression_loss: 0.9584 - classification_loss: 0.1366 413/500 [=======================>......] - ETA: 29s - loss: 1.0945 - regression_loss: 0.9579 - classification_loss: 0.1366 414/500 [=======================>......] - ETA: 29s - loss: 1.0939 - regression_loss: 0.9573 - classification_loss: 0.1365 415/500 [=======================>......] - ETA: 29s - loss: 1.0930 - regression_loss: 0.9567 - classification_loss: 0.1364 416/500 [=======================>......] - ETA: 28s - loss: 1.0940 - regression_loss: 0.9577 - classification_loss: 0.1364 417/500 [========================>.....] - ETA: 28s - loss: 1.0951 - regression_loss: 0.9587 - classification_loss: 0.1365 418/500 [========================>.....] - ETA: 28s - loss: 1.0936 - regression_loss: 0.9572 - classification_loss: 0.1364 419/500 [========================>.....] - ETA: 27s - loss: 1.0935 - regression_loss: 0.9570 - classification_loss: 0.1365 420/500 [========================>.....] - ETA: 27s - loss: 1.0949 - regression_loss: 0.9583 - classification_loss: 0.1366 421/500 [========================>.....] - ETA: 27s - loss: 1.0948 - regression_loss: 0.9582 - classification_loss: 0.1366 422/500 [========================>.....] - ETA: 26s - loss: 1.0945 - regression_loss: 0.9579 - classification_loss: 0.1366 423/500 [========================>.....] - ETA: 26s - loss: 1.0945 - regression_loss: 0.9579 - classification_loss: 0.1366 424/500 [========================>.....] - ETA: 26s - loss: 1.0951 - regression_loss: 0.9585 - classification_loss: 0.1366 425/500 [========================>.....] 
- ETA: 25s - loss: 1.0967 - regression_loss: 0.9599 - classification_loss: 0.1367 426/500 [========================>.....] - ETA: 25s - loss: 1.0977 - regression_loss: 0.9611 - classification_loss: 0.1366 427/500 [========================>.....] - ETA: 25s - loss: 1.0993 - regression_loss: 0.9625 - classification_loss: 0.1368 428/500 [========================>.....] - ETA: 24s - loss: 1.0999 - regression_loss: 0.9630 - classification_loss: 0.1368 429/500 [========================>.....] - ETA: 24s - loss: 1.0997 - regression_loss: 0.9628 - classification_loss: 0.1369 430/500 [========================>.....] - ETA: 24s - loss: 1.0996 - regression_loss: 0.9628 - classification_loss: 0.1368 431/500 [========================>.....] - ETA: 23s - loss: 1.0984 - regression_loss: 0.9617 - classification_loss: 0.1367 432/500 [========================>.....] - ETA: 23s - loss: 1.0983 - regression_loss: 0.9616 - classification_loss: 0.1367 433/500 [========================>.....] - ETA: 23s - loss: 1.0986 - regression_loss: 0.9619 - classification_loss: 0.1366 434/500 [=========================>....] - ETA: 22s - loss: 1.0987 - regression_loss: 0.9619 - classification_loss: 0.1368 435/500 [=========================>....] - ETA: 22s - loss: 1.0982 - regression_loss: 0.9615 - classification_loss: 0.1366 436/500 [=========================>....] - ETA: 21s - loss: 1.0980 - regression_loss: 0.9614 - classification_loss: 0.1366 437/500 [=========================>....] - ETA: 21s - loss: 1.0983 - regression_loss: 0.9617 - classification_loss: 0.1366 438/500 [=========================>....] - ETA: 21s - loss: 1.0989 - regression_loss: 0.9622 - classification_loss: 0.1368 439/500 [=========================>....] - ETA: 20s - loss: 1.0991 - regression_loss: 0.9623 - classification_loss: 0.1368 440/500 [=========================>....] - ETA: 20s - loss: 1.0990 - regression_loss: 0.9623 - classification_loss: 0.1367 441/500 [=========================>....] 
- ETA: 20s - loss: 1.0992 - regression_loss: 0.9624 - classification_loss: 0.1367 442/500 [=========================>....] - ETA: 19s - loss: 1.0998 - regression_loss: 0.9629 - classification_loss: 0.1369 443/500 [=========================>....] - ETA: 19s - loss: 1.0992 - regression_loss: 0.9624 - classification_loss: 0.1368 444/500 [=========================>....] - ETA: 19s - loss: 1.0991 - regression_loss: 0.9623 - classification_loss: 0.1368 445/500 [=========================>....] - ETA: 18s - loss: 1.1000 - regression_loss: 0.9630 - classification_loss: 0.1370 446/500 [=========================>....] - ETA: 18s - loss: 1.0997 - regression_loss: 0.9626 - classification_loss: 0.1370 447/500 [=========================>....] - ETA: 18s - loss: 1.0990 - regression_loss: 0.9620 - classification_loss: 0.1370 448/500 [=========================>....] - ETA: 17s - loss: 1.0977 - regression_loss: 0.9608 - classification_loss: 0.1368 449/500 [=========================>....] - ETA: 17s - loss: 1.0972 - regression_loss: 0.9605 - classification_loss: 0.1367 450/500 [==========================>...] - ETA: 17s - loss: 1.0972 - regression_loss: 0.9605 - classification_loss: 0.1367 451/500 [==========================>...] - ETA: 16s - loss: 1.0971 - regression_loss: 0.9605 - classification_loss: 0.1367 452/500 [==========================>...] - ETA: 16s - loss: 1.0982 - regression_loss: 0.9614 - classification_loss: 0.1369 453/500 [==========================>...] - ETA: 16s - loss: 1.0983 - regression_loss: 0.9614 - classification_loss: 0.1369 454/500 [==========================>...] - ETA: 15s - loss: 1.0990 - regression_loss: 0.9620 - classification_loss: 0.1370 455/500 [==========================>...] - ETA: 15s - loss: 1.0988 - regression_loss: 0.9619 - classification_loss: 0.1370 456/500 [==========================>...] - ETA: 15s - loss: 1.0979 - regression_loss: 0.9610 - classification_loss: 0.1369 457/500 [==========================>...] 
- ETA: 14s - loss: 1.0979 - regression_loss: 0.9610 - classification_loss: 0.1369 458/500 [==========================>...] - ETA: 14s - loss: 1.0982 - regression_loss: 0.9613 - classification_loss: 0.1369 459/500 [==========================>...] - ETA: 14s - loss: 1.0977 - regression_loss: 0.9608 - classification_loss: 0.1369 460/500 [==========================>...] - ETA: 13s - loss: 1.0980 - regression_loss: 0.9611 - classification_loss: 0.1369 461/500 [==========================>...] - ETA: 13s - loss: 1.0990 - regression_loss: 0.9621 - classification_loss: 0.1370 462/500 [==========================>...] - ETA: 13s - loss: 1.0994 - regression_loss: 0.9624 - classification_loss: 0.1370 463/500 [==========================>...] - ETA: 12s - loss: 1.0993 - regression_loss: 0.9624 - classification_loss: 0.1370 464/500 [==========================>...] - ETA: 12s - loss: 1.0983 - regression_loss: 0.9614 - classification_loss: 0.1369 465/500 [==========================>...] - ETA: 12s - loss: 1.0991 - regression_loss: 0.9621 - classification_loss: 0.1370 466/500 [==========================>...] - ETA: 11s - loss: 1.0988 - regression_loss: 0.9618 - classification_loss: 0.1370 467/500 [===========================>..] - ETA: 11s - loss: 1.0993 - regression_loss: 0.9622 - classification_loss: 0.1370 468/500 [===========================>..] - ETA: 10s - loss: 1.0988 - regression_loss: 0.9619 - classification_loss: 0.1369 469/500 [===========================>..] - ETA: 10s - loss: 1.0989 - regression_loss: 0.9619 - classification_loss: 0.1370 470/500 [===========================>..] - ETA: 10s - loss: 1.0993 - regression_loss: 0.9623 - classification_loss: 0.1371 471/500 [===========================>..] - ETA: 9s - loss: 1.0983 - regression_loss: 0.9614 - classification_loss: 0.1369  472/500 [===========================>..] - ETA: 9s - loss: 1.0982 - regression_loss: 0.9613 - classification_loss: 0.1369 473/500 [===========================>..] 
- ETA: 9s - loss: 1.0971 - regression_loss: 0.9604 - classification_loss: 0.1367 474/500 [===========================>..] - ETA: 8s - loss: 1.0965 - regression_loss: 0.9599 - classification_loss: 0.1366 475/500 [===========================>..] - ETA: 8s - loss: 1.0972 - regression_loss: 0.9605 - classification_loss: 0.1367 476/500 [===========================>..] - ETA: 8s - loss: 1.0974 - regression_loss: 0.9607 - classification_loss: 0.1367 477/500 [===========================>..] - ETA: 7s - loss: 1.0976 - regression_loss: 0.9609 - classification_loss: 0.1367 478/500 [===========================>..] - ETA: 7s - loss: 1.0973 - regression_loss: 0.9606 - classification_loss: 0.1368 479/500 [===========================>..] - ETA: 7s - loss: 1.0977 - regression_loss: 0.9610 - classification_loss: 0.1367 480/500 [===========================>..] - ETA: 6s - loss: 1.0969 - regression_loss: 0.9603 - classification_loss: 0.1366 481/500 [===========================>..] - ETA: 6s - loss: 1.0965 - regression_loss: 0.9600 - classification_loss: 0.1365 482/500 [===========================>..] - ETA: 6s - loss: 1.0955 - regression_loss: 0.9592 - classification_loss: 0.1363 483/500 [===========================>..] - ETA: 5s - loss: 1.0945 - regression_loss: 0.9583 - classification_loss: 0.1361 484/500 [============================>.] - ETA: 5s - loss: 1.0956 - regression_loss: 0.9593 - classification_loss: 0.1363 485/500 [============================>.] - ETA: 5s - loss: 1.0948 - regression_loss: 0.9586 - classification_loss: 0.1362 486/500 [============================>.] - ETA: 4s - loss: 1.0952 - regression_loss: 0.9590 - classification_loss: 0.1362 487/500 [============================>.] - ETA: 4s - loss: 1.0957 - regression_loss: 0.9594 - classification_loss: 0.1363 488/500 [============================>.] - ETA: 4s - loss: 1.0944 - regression_loss: 0.9583 - classification_loss: 0.1361 489/500 [============================>.] 
- ETA: 3s - loss: 1.0939 - regression_loss: 0.9578 - classification_loss: 0.1361 490/500 [============================>.] - ETA: 3s - loss: 1.0932 - regression_loss: 0.9573 - classification_loss: 0.1359 491/500 [============================>.] - ETA: 3s - loss: 1.0932 - regression_loss: 0.9573 - classification_loss: 0.1359 492/500 [============================>.] - ETA: 2s - loss: 1.0927 - regression_loss: 0.9568 - classification_loss: 0.1358 493/500 [============================>.] - ETA: 2s - loss: 1.0936 - regression_loss: 0.9577 - classification_loss: 0.1359 494/500 [============================>.] - ETA: 2s - loss: 1.0932 - regression_loss: 0.9573 - classification_loss: 0.1359 495/500 [============================>.] - ETA: 1s - loss: 1.0936 - regression_loss: 0.9577 - classification_loss: 0.1359 496/500 [============================>.] - ETA: 1s - loss: 1.0939 - regression_loss: 0.9580 - classification_loss: 0.1359 497/500 [============================>.] - ETA: 1s - loss: 1.0941 - regression_loss: 0.9582 - classification_loss: 0.1359 498/500 [============================>.] - ETA: 0s - loss: 1.0943 - regression_loss: 0.9583 - classification_loss: 0.1360 499/500 [============================>.] - ETA: 0s - loss: 1.0933 - regression_loss: 0.9574 - classification_loss: 0.1359 500/500 [==============================] - 172s 343ms/step - loss: 1.0930 - regression_loss: 0.9571 - classification_loss: 0.1360 1172 instances of class plum with average precision: 0.6663 mAP: 0.6663 Epoch 00008: saving model to ./training/snapshots/resnet101_pascal_08.h5 Epoch 9/150 1/500 [..............................] - ETA: 2:45 - loss: 1.9263 - regression_loss: 1.7289 - classification_loss: 0.1974 2/500 [..............................] - ETA: 2:49 - loss: 1.3455 - regression_loss: 1.2018 - classification_loss: 0.1437 3/500 [..............................] - ETA: 2:52 - loss: 1.0997 - regression_loss: 0.9833 - classification_loss: 0.1164 4/500 [..............................] 
[per-batch progress updates 4/500 through 66/500 collapsed; carriage-return progress-bar output]
67/500 [===>..........................] - ETA: 2:28 - loss: 1.1160 - regression_loss: 0.9659 - classification_loss: 0.1501 68/500 [===>..........................]
- ETA: 2:27 - loss: 1.1106 - regression_loss: 0.9616 - classification_loss: 0.1490 69/500 [===>..........................] - ETA: 2:27 - loss: 1.1131 - regression_loss: 0.9637 - classification_loss: 0.1495 70/500 [===>..........................] - ETA: 2:27 - loss: 1.1114 - regression_loss: 0.9629 - classification_loss: 0.1485 71/500 [===>..........................] - ETA: 2:26 - loss: 1.1145 - regression_loss: 0.9655 - classification_loss: 0.1490 72/500 [===>..........................] - ETA: 2:26 - loss: 1.1133 - regression_loss: 0.9647 - classification_loss: 0.1486 73/500 [===>..........................] - ETA: 2:26 - loss: 1.1031 - regression_loss: 0.9558 - classification_loss: 0.1473 74/500 [===>..........................] - ETA: 2:25 - loss: 1.0941 - regression_loss: 0.9479 - classification_loss: 0.1462 75/500 [===>..........................] - ETA: 2:25 - loss: 1.0998 - regression_loss: 0.9527 - classification_loss: 0.1470 76/500 [===>..........................] - ETA: 2:25 - loss: 1.0943 - regression_loss: 0.9474 - classification_loss: 0.1469 77/500 [===>..........................] - ETA: 2:24 - loss: 1.0958 - regression_loss: 0.9489 - classification_loss: 0.1469 78/500 [===>..........................] - ETA: 2:24 - loss: 1.0939 - regression_loss: 0.9473 - classification_loss: 0.1466 79/500 [===>..........................] - ETA: 2:24 - loss: 1.0984 - regression_loss: 0.9515 - classification_loss: 0.1469 80/500 [===>..........................] - ETA: 2:23 - loss: 1.0966 - regression_loss: 0.9501 - classification_loss: 0.1465 81/500 [===>..........................] - ETA: 2:23 - loss: 1.0939 - regression_loss: 0.9477 - classification_loss: 0.1462 82/500 [===>..........................] - ETA: 2:23 - loss: 1.0912 - regression_loss: 0.9456 - classification_loss: 0.1456 83/500 [===>..........................] - ETA: 2:22 - loss: 1.0949 - regression_loss: 0.9491 - classification_loss: 0.1458 84/500 [====>.........................] 
- ETA: 2:22 - loss: 1.0999 - regression_loss: 0.9534 - classification_loss: 0.1465 85/500 [====>.........................] - ETA: 2:22 - loss: 1.0968 - regression_loss: 0.9511 - classification_loss: 0.1456 86/500 [====>.........................] - ETA: 2:21 - loss: 1.0946 - regression_loss: 0.9490 - classification_loss: 0.1456 87/500 [====>.........................] - ETA: 2:21 - loss: 1.0910 - regression_loss: 0.9459 - classification_loss: 0.1451 88/500 [====>.........................] - ETA: 2:21 - loss: 1.0945 - regression_loss: 0.9491 - classification_loss: 0.1454 89/500 [====>.........................] - ETA: 2:20 - loss: 1.0894 - regression_loss: 0.9447 - classification_loss: 0.1447 90/500 [====>.........................] - ETA: 2:20 - loss: 1.0842 - regression_loss: 0.9401 - classification_loss: 0.1441 91/500 [====>.........................] - ETA: 2:20 - loss: 1.0808 - regression_loss: 0.9374 - classification_loss: 0.1433 92/500 [====>.........................] - ETA: 2:19 - loss: 1.0837 - regression_loss: 0.9399 - classification_loss: 0.1438 93/500 [====>.........................] - ETA: 2:19 - loss: 1.0849 - regression_loss: 0.9407 - classification_loss: 0.1442 94/500 [====>.........................] - ETA: 2:19 - loss: 1.0892 - regression_loss: 0.9445 - classification_loss: 0.1447 95/500 [====>.........................] - ETA: 2:18 - loss: 1.0925 - regression_loss: 0.9476 - classification_loss: 0.1449 96/500 [====>.........................] - ETA: 2:18 - loss: 1.0918 - regression_loss: 0.9475 - classification_loss: 0.1444 97/500 [====>.........................] - ETA: 2:18 - loss: 1.0878 - regression_loss: 0.9443 - classification_loss: 0.1435 98/500 [====>.........................] - ETA: 2:17 - loss: 1.0854 - regression_loss: 0.9422 - classification_loss: 0.1432 99/500 [====>.........................] - ETA: 2:17 - loss: 1.0857 - regression_loss: 0.9427 - classification_loss: 0.1430 100/500 [=====>........................] 
- ETA: 2:17 - loss: 1.0819 - regression_loss: 0.9390 - classification_loss: 0.1429 101/500 [=====>........................] - ETA: 2:16 - loss: 1.0880 - regression_loss: 0.9445 - classification_loss: 0.1435 102/500 [=====>........................] - ETA: 2:16 - loss: 1.0907 - regression_loss: 0.9472 - classification_loss: 0.1435 103/500 [=====>........................] - ETA: 2:16 - loss: 1.0898 - regression_loss: 0.9465 - classification_loss: 0.1433 104/500 [=====>........................] - ETA: 2:15 - loss: 1.0910 - regression_loss: 0.9473 - classification_loss: 0.1437 105/500 [=====>........................] - ETA: 2:15 - loss: 1.0890 - regression_loss: 0.9456 - classification_loss: 0.1435 106/500 [=====>........................] - ETA: 2:15 - loss: 1.0874 - regression_loss: 0.9447 - classification_loss: 0.1428 107/500 [=====>........................] - ETA: 2:15 - loss: 1.0877 - regression_loss: 0.9451 - classification_loss: 0.1426 108/500 [=====>........................] - ETA: 2:14 - loss: 1.0894 - regression_loss: 0.9463 - classification_loss: 0.1431 109/500 [=====>........................] - ETA: 2:14 - loss: 1.0905 - regression_loss: 0.9474 - classification_loss: 0.1430 110/500 [=====>........................] - ETA: 2:14 - loss: 1.0877 - regression_loss: 0.9450 - classification_loss: 0.1427 111/500 [=====>........................] - ETA: 2:13 - loss: 1.0900 - regression_loss: 0.9476 - classification_loss: 0.1424 112/500 [=====>........................] - ETA: 2:13 - loss: 1.0903 - regression_loss: 0.9479 - classification_loss: 0.1424 113/500 [=====>........................] - ETA: 2:12 - loss: 1.0915 - regression_loss: 0.9482 - classification_loss: 0.1433 114/500 [=====>........................] - ETA: 2:12 - loss: 1.0923 - regression_loss: 0.9486 - classification_loss: 0.1437 115/500 [=====>........................] - ETA: 2:12 - loss: 1.0957 - regression_loss: 0.9515 - classification_loss: 0.1442 116/500 [=====>........................] 
- ETA: 2:11 - loss: 1.0926 - regression_loss: 0.9490 - classification_loss: 0.1437 117/500 [======>.......................] - ETA: 2:11 - loss: 1.0888 - regression_loss: 0.9456 - classification_loss: 0.1432 118/500 [======>.......................] - ETA: 2:11 - loss: 1.0880 - regression_loss: 0.9449 - classification_loss: 0.1431 119/500 [======>.......................] - ETA: 2:11 - loss: 1.0881 - regression_loss: 0.9449 - classification_loss: 0.1433 120/500 [======>.......................] - ETA: 2:10 - loss: 1.0887 - regression_loss: 0.9455 - classification_loss: 0.1432 121/500 [======>.......................] - ETA: 2:10 - loss: 1.0918 - regression_loss: 0.9486 - classification_loss: 0.1432 122/500 [======>.......................] - ETA: 2:10 - loss: 1.0907 - regression_loss: 0.9477 - classification_loss: 0.1430 123/500 [======>.......................] - ETA: 2:09 - loss: 1.0874 - regression_loss: 0.9450 - classification_loss: 0.1425 124/500 [======>.......................] - ETA: 2:09 - loss: 1.0822 - regression_loss: 0.9405 - classification_loss: 0.1418 125/500 [======>.......................] - ETA: 2:09 - loss: 1.0828 - regression_loss: 0.9409 - classification_loss: 0.1419 126/500 [======>.......................] - ETA: 2:08 - loss: 1.0852 - regression_loss: 0.9433 - classification_loss: 0.1420 127/500 [======>.......................] - ETA: 2:08 - loss: 1.0890 - regression_loss: 0.9464 - classification_loss: 0.1426 128/500 [======>.......................] - ETA: 2:08 - loss: 1.0932 - regression_loss: 0.9499 - classification_loss: 0.1432 129/500 [======>.......................] - ETA: 2:07 - loss: 1.0914 - regression_loss: 0.9485 - classification_loss: 0.1429 130/500 [======>.......................] - ETA: 2:07 - loss: 1.0909 - regression_loss: 0.9479 - classification_loss: 0.1429 131/500 [======>.......................] - ETA: 2:06 - loss: 1.0926 - regression_loss: 0.9494 - classification_loss: 0.1432 132/500 [======>.......................] 
- ETA: 2:06 - loss: 1.0927 - regression_loss: 0.9497 - classification_loss: 0.1430 133/500 [======>.......................] - ETA: 2:06 - loss: 1.0892 - regression_loss: 0.9467 - classification_loss: 0.1426 134/500 [=======>......................] - ETA: 2:05 - loss: 1.0898 - regression_loss: 0.9473 - classification_loss: 0.1424 135/500 [=======>......................] - ETA: 2:05 - loss: 1.0893 - regression_loss: 0.9471 - classification_loss: 0.1423 136/500 [=======>......................] - ETA: 2:05 - loss: 1.0891 - regression_loss: 0.9472 - classification_loss: 0.1420 137/500 [=======>......................] - ETA: 2:04 - loss: 1.0841 - regression_loss: 0.9428 - classification_loss: 0.1413 138/500 [=======>......................] - ETA: 2:04 - loss: 1.0827 - regression_loss: 0.9417 - classification_loss: 0.1411 139/500 [=======>......................] - ETA: 2:04 - loss: 1.0846 - regression_loss: 0.9431 - classification_loss: 0.1415 140/500 [=======>......................] - ETA: 2:03 - loss: 1.0818 - regression_loss: 0.9406 - classification_loss: 0.1412 141/500 [=======>......................] - ETA: 2:03 - loss: 1.0847 - regression_loss: 0.9432 - classification_loss: 0.1415 142/500 [=======>......................] - ETA: 2:03 - loss: 1.0856 - regression_loss: 0.9441 - classification_loss: 0.1416 143/500 [=======>......................] - ETA: 2:02 - loss: 1.0820 - regression_loss: 0.9408 - classification_loss: 0.1412 144/500 [=======>......................] - ETA: 2:02 - loss: 1.0789 - regression_loss: 0.9383 - classification_loss: 0.1406 145/500 [=======>......................] - ETA: 2:02 - loss: 1.0748 - regression_loss: 0.9347 - classification_loss: 0.1401 146/500 [=======>......................] - ETA: 2:01 - loss: 1.0747 - regression_loss: 0.9345 - classification_loss: 0.1402 147/500 [=======>......................] - ETA: 2:01 - loss: 1.0766 - regression_loss: 0.9361 - classification_loss: 0.1405 148/500 [=======>......................] 
- ETA: 2:01 - loss: 1.0766 - regression_loss: 0.9361 - classification_loss: 0.1405 149/500 [=======>......................] - ETA: 2:00 - loss: 1.0741 - regression_loss: 0.9342 - classification_loss: 0.1399 150/500 [========>.....................] - ETA: 2:00 - loss: 1.0779 - regression_loss: 0.9372 - classification_loss: 0.1407 151/500 [========>.....................] - ETA: 2:00 - loss: 1.0782 - regression_loss: 0.9378 - classification_loss: 0.1405 152/500 [========>.....................] - ETA: 1:59 - loss: 1.0777 - regression_loss: 0.9371 - classification_loss: 0.1406 153/500 [========>.....................] - ETA: 1:59 - loss: 1.0772 - regression_loss: 0.9364 - classification_loss: 0.1407 154/500 [========>.....................] - ETA: 1:59 - loss: 1.0797 - regression_loss: 0.9388 - classification_loss: 0.1408 155/500 [========>.....................] - ETA: 1:58 - loss: 1.0794 - regression_loss: 0.9387 - classification_loss: 0.1408 156/500 [========>.....................] - ETA: 1:58 - loss: 1.0804 - regression_loss: 0.9399 - classification_loss: 0.1405 157/500 [========>.....................] - ETA: 1:58 - loss: 1.0775 - regression_loss: 0.9374 - classification_loss: 0.1401 158/500 [========>.....................] - ETA: 1:57 - loss: 1.0793 - regression_loss: 0.9388 - classification_loss: 0.1405 159/500 [========>.....................] - ETA: 1:57 - loss: 1.0816 - regression_loss: 0.9410 - classification_loss: 0.1406 160/500 [========>.....................] - ETA: 1:57 - loss: 1.0787 - regression_loss: 0.9385 - classification_loss: 0.1402 161/500 [========>.....................] - ETA: 1:56 - loss: 1.0796 - regression_loss: 0.9393 - classification_loss: 0.1403 162/500 [========>.....................] - ETA: 1:56 - loss: 1.0790 - regression_loss: 0.9390 - classification_loss: 0.1401 163/500 [========>.....................] - ETA: 1:56 - loss: 1.0806 - regression_loss: 0.9405 - classification_loss: 0.1401 164/500 [========>.....................] 
- ETA: 1:55 - loss: 1.0796 - regression_loss: 0.9397 - classification_loss: 0.1399 165/500 [========>.....................] - ETA: 1:55 - loss: 1.0767 - regression_loss: 0.9373 - classification_loss: 0.1394 166/500 [========>.....................] - ETA: 1:54 - loss: 1.0786 - regression_loss: 0.9390 - classification_loss: 0.1396 167/500 [=========>....................] - ETA: 1:54 - loss: 1.0785 - regression_loss: 0.9391 - classification_loss: 0.1394 168/500 [=========>....................] - ETA: 1:54 - loss: 1.0813 - regression_loss: 0.9414 - classification_loss: 0.1399 169/500 [=========>....................] - ETA: 1:54 - loss: 1.0784 - regression_loss: 0.9389 - classification_loss: 0.1394 170/500 [=========>....................] - ETA: 1:53 - loss: 1.0770 - regression_loss: 0.9379 - classification_loss: 0.1392 171/500 [=========>....................] - ETA: 1:53 - loss: 1.0771 - regression_loss: 0.9381 - classification_loss: 0.1390 172/500 [=========>....................] - ETA: 1:52 - loss: 1.0775 - regression_loss: 0.9386 - classification_loss: 0.1389 173/500 [=========>....................] - ETA: 1:52 - loss: 1.0756 - regression_loss: 0.9371 - classification_loss: 0.1386 174/500 [=========>....................] - ETA: 1:52 - loss: 1.0753 - regression_loss: 0.9369 - classification_loss: 0.1384 175/500 [=========>....................] - ETA: 1:52 - loss: 1.0788 - regression_loss: 0.9403 - classification_loss: 0.1385 176/500 [=========>....................] - ETA: 1:51 - loss: 1.0764 - regression_loss: 0.9382 - classification_loss: 0.1381 177/500 [=========>....................] - ETA: 1:51 - loss: 1.0770 - regression_loss: 0.9388 - classification_loss: 0.1382 178/500 [=========>....................] - ETA: 1:50 - loss: 1.0802 - regression_loss: 0.9417 - classification_loss: 0.1385 179/500 [=========>....................] - ETA: 1:50 - loss: 1.0784 - regression_loss: 0.9401 - classification_loss: 0.1383 180/500 [=========>....................] 
- ETA: 1:50 - loss: 1.0794 - regression_loss: 0.9410 - classification_loss: 0.1385 181/500 [=========>....................] - ETA: 1:49 - loss: 1.0796 - regression_loss: 0.9411 - classification_loss: 0.1385 182/500 [=========>....................] - ETA: 1:49 - loss: 1.0766 - regression_loss: 0.9386 - classification_loss: 0.1380 183/500 [=========>....................] - ETA: 1:49 - loss: 1.0759 - regression_loss: 0.9384 - classification_loss: 0.1376 184/500 [==========>...................] - ETA: 1:48 - loss: 1.0773 - regression_loss: 0.9395 - classification_loss: 0.1377 185/500 [==========>...................] - ETA: 1:48 - loss: 1.0762 - regression_loss: 0.9388 - classification_loss: 0.1374 186/500 [==========>...................] - ETA: 1:48 - loss: 1.0794 - regression_loss: 0.9417 - classification_loss: 0.1377 187/500 [==========>...................] - ETA: 1:47 - loss: 1.0799 - regression_loss: 0.9423 - classification_loss: 0.1376 188/500 [==========>...................] - ETA: 1:47 - loss: 1.0804 - regression_loss: 0.9425 - classification_loss: 0.1379 189/500 [==========>...................] - ETA: 1:47 - loss: 1.0798 - regression_loss: 0.9421 - classification_loss: 0.1377 190/500 [==========>...................] - ETA: 1:46 - loss: 1.0808 - regression_loss: 0.9428 - classification_loss: 0.1379 191/500 [==========>...................] - ETA: 1:46 - loss: 1.0788 - regression_loss: 0.9413 - classification_loss: 0.1375 192/500 [==========>...................] - ETA: 1:46 - loss: 1.0778 - regression_loss: 0.9404 - classification_loss: 0.1374 193/500 [==========>...................] - ETA: 1:45 - loss: 1.0786 - regression_loss: 0.9409 - classification_loss: 0.1377 194/500 [==========>...................] - ETA: 1:45 - loss: 1.0753 - regression_loss: 0.9380 - classification_loss: 0.1373 195/500 [==========>...................] - ETA: 1:44 - loss: 1.0747 - regression_loss: 0.9372 - classification_loss: 0.1376 196/500 [==========>...................] 
- ETA: 1:44 - loss: 1.0727 - regression_loss: 0.9356 - classification_loss: 0.1372 197/500 [==========>...................] - ETA: 1:44 - loss: 1.0730 - regression_loss: 0.9362 - classification_loss: 0.1368 198/500 [==========>...................] - ETA: 1:43 - loss: 1.0727 - regression_loss: 0.9361 - classification_loss: 0.1367 199/500 [==========>...................] - ETA: 1:43 - loss: 1.0734 - regression_loss: 0.9368 - classification_loss: 0.1366 200/500 [===========>..................] - ETA: 1:43 - loss: 1.0730 - regression_loss: 0.9365 - classification_loss: 0.1365 201/500 [===========>..................] - ETA: 1:42 - loss: 1.0735 - regression_loss: 0.9372 - classification_loss: 0.1364 202/500 [===========>..................] - ETA: 1:42 - loss: 1.0736 - regression_loss: 0.9372 - classification_loss: 0.1364 203/500 [===========>..................] - ETA: 1:42 - loss: 1.0718 - regression_loss: 0.9357 - classification_loss: 0.1360 204/500 [===========>..................] - ETA: 1:41 - loss: 1.0716 - regression_loss: 0.9355 - classification_loss: 0.1361 205/500 [===========>..................] - ETA: 1:41 - loss: 1.0731 - regression_loss: 0.9368 - classification_loss: 0.1363 206/500 [===========>..................] - ETA: 1:41 - loss: 1.0746 - regression_loss: 0.9381 - classification_loss: 0.1365 207/500 [===========>..................] - ETA: 1:40 - loss: 1.0744 - regression_loss: 0.9379 - classification_loss: 0.1364 208/500 [===========>..................] - ETA: 1:40 - loss: 1.0730 - regression_loss: 0.9369 - classification_loss: 0.1362 209/500 [===========>..................] - ETA: 1:40 - loss: 1.0714 - regression_loss: 0.9354 - classification_loss: 0.1360 210/500 [===========>..................] - ETA: 1:39 - loss: 1.0683 - regression_loss: 0.9328 - classification_loss: 0.1356 211/500 [===========>..................] - ETA: 1:39 - loss: 1.0654 - regression_loss: 0.9303 - classification_loss: 0.1351 212/500 [===========>..................] 
- ETA: 1:39 - loss: 1.0667 - regression_loss: 0.9316 - classification_loss: 0.1351 213/500 [===========>..................] - ETA: 1:38 - loss: 1.0660 - regression_loss: 0.9309 - classification_loss: 0.1351 214/500 [===========>..................] - ETA: 1:38 - loss: 1.0637 - regression_loss: 0.9289 - classification_loss: 0.1349 215/500 [===========>..................] - ETA: 1:37 - loss: 1.0641 - regression_loss: 0.9292 - classification_loss: 0.1349 216/500 [===========>..................] - ETA: 1:37 - loss: 1.0646 - regression_loss: 0.9298 - classification_loss: 0.1348 217/500 [============>.................] - ETA: 1:37 - loss: 1.0650 - regression_loss: 0.9304 - classification_loss: 0.1345 218/500 [============>.................] - ETA: 1:36 - loss: 1.0648 - regression_loss: 0.9304 - classification_loss: 0.1345 219/500 [============>.................] - ETA: 1:36 - loss: 1.0653 - regression_loss: 0.9309 - classification_loss: 0.1344 220/500 [============>.................] - ETA: 1:36 - loss: 1.0652 - regression_loss: 0.9311 - classification_loss: 0.1341 221/500 [============>.................] - ETA: 1:35 - loss: 1.0641 - regression_loss: 0.9302 - classification_loss: 0.1339 222/500 [============>.................] - ETA: 1:35 - loss: 1.0625 - regression_loss: 0.9288 - classification_loss: 0.1337 223/500 [============>.................] - ETA: 1:35 - loss: 1.0629 - regression_loss: 0.9291 - classification_loss: 0.1337 224/500 [============>.................] - ETA: 1:34 - loss: 1.0629 - regression_loss: 0.9292 - classification_loss: 0.1337 225/500 [============>.................] - ETA: 1:34 - loss: 1.0620 - regression_loss: 0.9283 - classification_loss: 0.1337 226/500 [============>.................] - ETA: 1:34 - loss: 1.0615 - regression_loss: 0.9280 - classification_loss: 0.1334 227/500 [============>.................] - ETA: 1:33 - loss: 1.0599 - regression_loss: 0.9267 - classification_loss: 0.1332 228/500 [============>.................] 
- ETA: 1:33 - loss: 1.0600 - regression_loss: 0.9268 - classification_loss: 0.1332 229/500 [============>.................] - ETA: 1:33 - loss: 1.0613 - regression_loss: 0.9280 - classification_loss: 0.1333 230/500 [============>.................] - ETA: 1:32 - loss: 1.0601 - regression_loss: 0.9269 - classification_loss: 0.1332 231/500 [============>.................] - ETA: 1:32 - loss: 1.0592 - regression_loss: 0.9261 - classification_loss: 0.1331 232/500 [============>.................] - ETA: 1:32 - loss: 1.0576 - regression_loss: 0.9246 - classification_loss: 0.1329 233/500 [============>.................] - ETA: 1:31 - loss: 1.0562 - regression_loss: 0.9234 - classification_loss: 0.1328 234/500 [=============>................] - ETA: 1:31 - loss: 1.0558 - regression_loss: 0.9231 - classification_loss: 0.1327 235/500 [=============>................] - ETA: 1:31 - loss: 1.0554 - regression_loss: 0.9228 - classification_loss: 0.1326 236/500 [=============>................] - ETA: 1:30 - loss: 1.0562 - regression_loss: 0.9237 - classification_loss: 0.1325 237/500 [=============>................] - ETA: 1:30 - loss: 1.0574 - regression_loss: 0.9249 - classification_loss: 0.1326 238/500 [=============>................] - ETA: 1:30 - loss: 1.0553 - regression_loss: 0.9231 - classification_loss: 0.1321 239/500 [=============>................] - ETA: 1:29 - loss: 1.0559 - regression_loss: 0.9235 - classification_loss: 0.1324 240/500 [=============>................] - ETA: 1:29 - loss: 1.0551 - regression_loss: 0.9228 - classification_loss: 0.1324 241/500 [=============>................] - ETA: 1:29 - loss: 1.0548 - regression_loss: 0.9227 - classification_loss: 0.1321 242/500 [=============>................] - ETA: 1:28 - loss: 1.0545 - regression_loss: 0.9226 - classification_loss: 0.1319 243/500 [=============>................] - ETA: 1:28 - loss: 1.0537 - regression_loss: 0.9221 - classification_loss: 0.1316 244/500 [=============>................] 
- ETA: 1:27 - loss: 1.0516 - regression_loss: 0.9202 - classification_loss: 0.1313 245/500 [=============>................] - ETA: 1:27 - loss: 1.0511 - regression_loss: 0.9197 - classification_loss: 0.1313 246/500 [=============>................] - ETA: 1:27 - loss: 1.0490 - regression_loss: 0.9180 - classification_loss: 0.1310 247/500 [=============>................] - ETA: 1:26 - loss: 1.0500 - regression_loss: 0.9187 - classification_loss: 0.1313 248/500 [=============>................] - ETA: 1:26 - loss: 1.0510 - regression_loss: 0.9197 - classification_loss: 0.1314 249/500 [=============>................] - ETA: 1:26 - loss: 1.0522 - regression_loss: 0.9207 - classification_loss: 0.1314 250/500 [==============>...............] - ETA: 1:25 - loss: 1.0513 - regression_loss: 0.9200 - classification_loss: 0.1313 251/500 [==============>...............] - ETA: 1:25 - loss: 1.0504 - regression_loss: 0.9193 - classification_loss: 0.1311 252/500 [==============>...............] - ETA: 1:25 - loss: 1.0486 - regression_loss: 0.9178 - classification_loss: 0.1308 253/500 [==============>...............] - ETA: 1:24 - loss: 1.0469 - regression_loss: 0.9163 - classification_loss: 0.1306 254/500 [==============>...............] - ETA: 1:24 - loss: 1.0485 - regression_loss: 0.9177 - classification_loss: 0.1308 255/500 [==============>...............] - ETA: 1:24 - loss: 1.0474 - regression_loss: 0.9167 - classification_loss: 0.1307 256/500 [==============>...............] - ETA: 1:23 - loss: 1.0463 - regression_loss: 0.9157 - classification_loss: 0.1306 257/500 [==============>...............] - ETA: 1:23 - loss: 1.0467 - regression_loss: 0.9162 - classification_loss: 0.1305 258/500 [==============>...............] - ETA: 1:23 - loss: 1.0466 - regression_loss: 0.9162 - classification_loss: 0.1304 259/500 [==============>...............] - ETA: 1:22 - loss: 1.0451 - regression_loss: 0.9148 - classification_loss: 0.1303 260/500 [==============>...............] 
- ETA: 1:22 - loss: 1.0448 - regression_loss: 0.9147 - classification_loss: 0.1302 261/500 [==============>...............] - ETA: 1:22 - loss: 1.0448 - regression_loss: 0.9147 - classification_loss: 0.1301 262/500 [==============>...............] - ETA: 1:21 - loss: 1.0426 - regression_loss: 0.9128 - classification_loss: 0.1298 263/500 [==============>...............] - ETA: 1:21 - loss: 1.0436 - regression_loss: 0.9135 - classification_loss: 0.1301 264/500 [==============>...............] - ETA: 1:21 - loss: 1.0417 - regression_loss: 0.9118 - classification_loss: 0.1299 265/500 [==============>...............] - ETA: 1:20 - loss: 1.0421 - regression_loss: 0.9122 - classification_loss: 0.1299 266/500 [==============>...............] - ETA: 1:20 - loss: 1.0420 - regression_loss: 0.9121 - classification_loss: 0.1299 267/500 [===============>..............] - ETA: 1:20 - loss: 1.0409 - regression_loss: 0.9111 - classification_loss: 0.1298 268/500 [===============>..............] - ETA: 1:19 - loss: 1.0409 - regression_loss: 0.9112 - classification_loss: 0.1298 269/500 [===============>..............] - ETA: 1:19 - loss: 1.0414 - regression_loss: 0.9116 - classification_loss: 0.1298 270/500 [===============>..............] - ETA: 1:19 - loss: 1.0433 - regression_loss: 0.9131 - classification_loss: 0.1302 271/500 [===============>..............] - ETA: 1:18 - loss: 1.0433 - regression_loss: 0.9132 - classification_loss: 0.1301 272/500 [===============>..............] - ETA: 1:18 - loss: 1.0422 - regression_loss: 0.9123 - classification_loss: 0.1299 273/500 [===============>..............] - ETA: 1:17 - loss: 1.0427 - regression_loss: 0.9126 - classification_loss: 0.1300 274/500 [===============>..............] - ETA: 1:17 - loss: 1.0413 - regression_loss: 0.9115 - classification_loss: 0.1298 275/500 [===============>..............] - ETA: 1:17 - loss: 1.0411 - regression_loss: 0.9113 - classification_loss: 0.1298 276/500 [===============>..............] 
- ETA: 1:16 - loss: 1.0393 - regression_loss: 0.9097 - classification_loss: 0.1295
[... progress-bar redraws for steps 277-498 of epoch 9 omitted; loss fluctuated between ~1.02 and ~1.05, ending near 1.024 ...]
499/500 [============================>.]
- ETA: 0s - loss: 1.0251 - regression_loss: 0.8988 - classification_loss: 0.1263
500/500 [==============================] - 172s 344ms/step - loss: 1.0256 - regression_loss: 0.8993 - classification_loss: 0.1263
1172 instances of class plum with average precision: 0.6756
mAP: 0.6756
Epoch 00009: saving model to ./training/snapshots/resnet101_pascal_09.h5
Epoch 10/150
1/500 [..............................] - ETA: 2:29 - loss: 0.5458 - regression_loss: 0.4494 - classification_loss: 0.0964
[... progress-bar redraws for steps 2-13 omitted; loss rose from ~0.55 to ~0.96 as the running average filled in ...]
14/500 [..............................]
- ETA: 2:45 - loss: 0.9355 - regression_loss: 0.8191 - classification_loss: 0.1164
[... progress-bar redraws for steps 15-109 of epoch 10 omitted; loss fluctuated between ~0.90 and ~1.03 ...]
110/500 [=====>........................]
- ETA: 2:13 - loss: 0.9523 - regression_loss: 0.8383 - classification_loss: 0.1140 111/500 [=====>........................] - ETA: 2:13 - loss: 0.9512 - regression_loss: 0.8374 - classification_loss: 0.1138 112/500 [=====>........................] - ETA: 2:13 - loss: 0.9494 - regression_loss: 0.8358 - classification_loss: 0.1136 113/500 [=====>........................] - ETA: 2:12 - loss: 0.9492 - regression_loss: 0.8355 - classification_loss: 0.1137 114/500 [=====>........................] - ETA: 2:12 - loss: 0.9437 - regression_loss: 0.8305 - classification_loss: 0.1132 115/500 [=====>........................] - ETA: 2:12 - loss: 0.9415 - regression_loss: 0.8284 - classification_loss: 0.1131 116/500 [=====>........................] - ETA: 2:11 - loss: 0.9452 - regression_loss: 0.8317 - classification_loss: 0.1135 117/500 [======>.......................] - ETA: 2:11 - loss: 0.9448 - regression_loss: 0.8313 - classification_loss: 0.1135 118/500 [======>.......................] - ETA: 2:11 - loss: 0.9470 - regression_loss: 0.8335 - classification_loss: 0.1134 119/500 [======>.......................] - ETA: 2:10 - loss: 0.9457 - regression_loss: 0.8325 - classification_loss: 0.1132 120/500 [======>.......................] - ETA: 2:10 - loss: 0.9459 - regression_loss: 0.8325 - classification_loss: 0.1134 121/500 [======>.......................] - ETA: 2:10 - loss: 0.9426 - regression_loss: 0.8299 - classification_loss: 0.1128 122/500 [======>.......................] - ETA: 2:09 - loss: 0.9446 - regression_loss: 0.8317 - classification_loss: 0.1128 123/500 [======>.......................] - ETA: 2:09 - loss: 0.9481 - regression_loss: 0.8346 - classification_loss: 0.1135 124/500 [======>.......................] - ETA: 2:09 - loss: 0.9460 - regression_loss: 0.8329 - classification_loss: 0.1131 125/500 [======>.......................] - ETA: 2:08 - loss: 0.9470 - regression_loss: 0.8338 - classification_loss: 0.1132 126/500 [======>.......................] 
- ETA: 2:08 - loss: 0.9499 - regression_loss: 0.8365 - classification_loss: 0.1134 127/500 [======>.......................] - ETA: 2:08 - loss: 0.9489 - regression_loss: 0.8357 - classification_loss: 0.1132 128/500 [======>.......................] - ETA: 2:07 - loss: 0.9472 - regression_loss: 0.8343 - classification_loss: 0.1129 129/500 [======>.......................] - ETA: 2:07 - loss: 0.9520 - regression_loss: 0.8389 - classification_loss: 0.1131 130/500 [======>.......................] - ETA: 2:07 - loss: 0.9510 - regression_loss: 0.8380 - classification_loss: 0.1130 131/500 [======>.......................] - ETA: 2:06 - loss: 0.9516 - regression_loss: 0.8386 - classification_loss: 0.1129 132/500 [======>.......................] - ETA: 2:06 - loss: 0.9500 - regression_loss: 0.8373 - classification_loss: 0.1127 133/500 [======>.......................] - ETA: 2:05 - loss: 0.9482 - regression_loss: 0.8360 - classification_loss: 0.1122 134/500 [=======>......................] - ETA: 2:05 - loss: 0.9483 - regression_loss: 0.8362 - classification_loss: 0.1121 135/500 [=======>......................] - ETA: 2:05 - loss: 0.9471 - regression_loss: 0.8351 - classification_loss: 0.1119 136/500 [=======>......................] - ETA: 2:04 - loss: 0.9490 - regression_loss: 0.8365 - classification_loss: 0.1125 137/500 [=======>......................] - ETA: 2:04 - loss: 0.9460 - regression_loss: 0.8339 - classification_loss: 0.1121 138/500 [=======>......................] - ETA: 2:04 - loss: 0.9463 - regression_loss: 0.8342 - classification_loss: 0.1121 139/500 [=======>......................] - ETA: 2:04 - loss: 0.9440 - regression_loss: 0.8322 - classification_loss: 0.1117 140/500 [=======>......................] - ETA: 2:03 - loss: 0.9417 - regression_loss: 0.8301 - classification_loss: 0.1116 141/500 [=======>......................] - ETA: 2:03 - loss: 0.9428 - regression_loss: 0.8312 - classification_loss: 0.1116 142/500 [=======>......................] 
- ETA: 2:03 - loss: 0.9437 - regression_loss: 0.8320 - classification_loss: 0.1117 143/500 [=======>......................] - ETA: 2:02 - loss: 0.9433 - regression_loss: 0.8316 - classification_loss: 0.1117 144/500 [=======>......................] - ETA: 2:02 - loss: 0.9422 - regression_loss: 0.8304 - classification_loss: 0.1118 145/500 [=======>......................] - ETA: 2:02 - loss: 0.9416 - regression_loss: 0.8297 - classification_loss: 0.1119 146/500 [=======>......................] - ETA: 2:01 - loss: 0.9393 - regression_loss: 0.8277 - classification_loss: 0.1116 147/500 [=======>......................] - ETA: 2:01 - loss: 0.9390 - regression_loss: 0.8276 - classification_loss: 0.1114 148/500 [=======>......................] - ETA: 2:00 - loss: 0.9386 - regression_loss: 0.8270 - classification_loss: 0.1116 149/500 [=======>......................] - ETA: 2:00 - loss: 0.9355 - regression_loss: 0.8244 - classification_loss: 0.1111 150/500 [========>.....................] - ETA: 2:00 - loss: 0.9339 - regression_loss: 0.8230 - classification_loss: 0.1109 151/500 [========>.....................] - ETA: 1:59 - loss: 0.9321 - regression_loss: 0.8215 - classification_loss: 0.1106 152/500 [========>.....................] - ETA: 1:59 - loss: 0.9299 - regression_loss: 0.8196 - classification_loss: 0.1103 153/500 [========>.....................] - ETA: 1:59 - loss: 0.9342 - regression_loss: 0.8233 - classification_loss: 0.1108 154/500 [========>.....................] - ETA: 1:58 - loss: 0.9315 - regression_loss: 0.8211 - classification_loss: 0.1105 155/500 [========>.....................] - ETA: 1:58 - loss: 0.9332 - regression_loss: 0.8225 - classification_loss: 0.1107 156/500 [========>.....................] - ETA: 1:58 - loss: 0.9358 - regression_loss: 0.8247 - classification_loss: 0.1110 157/500 [========>.....................] - ETA: 1:57 - loss: 0.9347 - regression_loss: 0.8238 - classification_loss: 0.1109 158/500 [========>.....................] 
- ETA: 1:57 - loss: 0.9323 - regression_loss: 0.8218 - classification_loss: 0.1105 159/500 [========>.....................] - ETA: 1:57 - loss: 0.9309 - regression_loss: 0.8206 - classification_loss: 0.1103 160/500 [========>.....................] - ETA: 1:56 - loss: 0.9292 - regression_loss: 0.8191 - classification_loss: 0.1102 161/500 [========>.....................] - ETA: 1:56 - loss: 0.9308 - regression_loss: 0.8201 - classification_loss: 0.1107 162/500 [========>.....................] - ETA: 1:56 - loss: 0.9281 - regression_loss: 0.8176 - classification_loss: 0.1105 163/500 [========>.....................] - ETA: 1:55 - loss: 0.9287 - regression_loss: 0.8182 - classification_loss: 0.1105 164/500 [========>.....................] - ETA: 1:55 - loss: 0.9279 - regression_loss: 0.8175 - classification_loss: 0.1103 165/500 [========>.....................] - ETA: 1:55 - loss: 0.9279 - regression_loss: 0.8177 - classification_loss: 0.1102 166/500 [========>.....................] - ETA: 1:54 - loss: 0.9264 - regression_loss: 0.8163 - classification_loss: 0.1100 167/500 [=========>....................] - ETA: 1:54 - loss: 0.9266 - regression_loss: 0.8165 - classification_loss: 0.1101 168/500 [=========>....................] - ETA: 1:54 - loss: 0.9259 - regression_loss: 0.8159 - classification_loss: 0.1100 169/500 [=========>....................] - ETA: 1:53 - loss: 0.9263 - regression_loss: 0.8160 - classification_loss: 0.1103 170/500 [=========>....................] - ETA: 1:53 - loss: 0.9260 - regression_loss: 0.8158 - classification_loss: 0.1102 171/500 [=========>....................] - ETA: 1:53 - loss: 0.9284 - regression_loss: 0.8180 - classification_loss: 0.1104 172/500 [=========>....................] - ETA: 1:52 - loss: 0.9303 - regression_loss: 0.8196 - classification_loss: 0.1106 173/500 [=========>....................] - ETA: 1:52 - loss: 0.9307 - regression_loss: 0.8201 - classification_loss: 0.1106 174/500 [=========>....................] 
- ETA: 1:51 - loss: 0.9306 - regression_loss: 0.8203 - classification_loss: 0.1104 175/500 [=========>....................] - ETA: 1:51 - loss: 0.9333 - regression_loss: 0.8229 - classification_loss: 0.1104 176/500 [=========>....................] - ETA: 1:51 - loss: 0.9328 - regression_loss: 0.8226 - classification_loss: 0.1102 177/500 [=========>....................] - ETA: 1:50 - loss: 0.9310 - regression_loss: 0.8209 - classification_loss: 0.1101 178/500 [=========>....................] - ETA: 1:50 - loss: 0.9300 - regression_loss: 0.8200 - classification_loss: 0.1099 179/500 [=========>....................] - ETA: 1:50 - loss: 0.9311 - regression_loss: 0.8208 - classification_loss: 0.1104 180/500 [=========>....................] - ETA: 1:49 - loss: 0.9319 - regression_loss: 0.8211 - classification_loss: 0.1108 181/500 [=========>....................] - ETA: 1:49 - loss: 0.9316 - regression_loss: 0.8209 - classification_loss: 0.1107 182/500 [=========>....................] - ETA: 1:49 - loss: 0.9327 - regression_loss: 0.8219 - classification_loss: 0.1108 183/500 [=========>....................] - ETA: 1:48 - loss: 0.9325 - regression_loss: 0.8217 - classification_loss: 0.1108 184/500 [==========>...................] - ETA: 1:48 - loss: 0.9318 - regression_loss: 0.8213 - classification_loss: 0.1105 185/500 [==========>...................] - ETA: 1:48 - loss: 0.9311 - regression_loss: 0.8205 - classification_loss: 0.1106 186/500 [==========>...................] - ETA: 1:47 - loss: 0.9325 - regression_loss: 0.8220 - classification_loss: 0.1106 187/500 [==========>...................] - ETA: 1:47 - loss: 0.9337 - regression_loss: 0.8229 - classification_loss: 0.1108 188/500 [==========>...................] - ETA: 1:47 - loss: 0.9376 - regression_loss: 0.8262 - classification_loss: 0.1114 189/500 [==========>...................] - ETA: 1:46 - loss: 0.9361 - regression_loss: 0.8250 - classification_loss: 0.1111 190/500 [==========>...................] 
- ETA: 1:46 - loss: 0.9359 - regression_loss: 0.8248 - classification_loss: 0.1111 191/500 [==========>...................] - ETA: 1:46 - loss: 0.9334 - regression_loss: 0.8226 - classification_loss: 0.1108 192/500 [==========>...................] - ETA: 1:45 - loss: 0.9329 - regression_loss: 0.8221 - classification_loss: 0.1108 193/500 [==========>...................] - ETA: 1:45 - loss: 0.9372 - regression_loss: 0.8255 - classification_loss: 0.1117 194/500 [==========>...................] - ETA: 1:45 - loss: 0.9363 - regression_loss: 0.8248 - classification_loss: 0.1115 195/500 [==========>...................] - ETA: 1:44 - loss: 0.9349 - regression_loss: 0.8238 - classification_loss: 0.1112 196/500 [==========>...................] - ETA: 1:44 - loss: 0.9345 - regression_loss: 0.8237 - classification_loss: 0.1109 197/500 [==========>...................] - ETA: 1:44 - loss: 0.9339 - regression_loss: 0.8232 - classification_loss: 0.1107 198/500 [==========>...................] - ETA: 1:43 - loss: 0.9342 - regression_loss: 0.8234 - classification_loss: 0.1109 199/500 [==========>...................] - ETA: 1:43 - loss: 0.9364 - regression_loss: 0.8252 - classification_loss: 0.1112 200/500 [===========>..................] - ETA: 1:43 - loss: 0.9380 - regression_loss: 0.8266 - classification_loss: 0.1114 201/500 [===========>..................] - ETA: 1:42 - loss: 0.9402 - regression_loss: 0.8283 - classification_loss: 0.1119 202/500 [===========>..................] - ETA: 1:42 - loss: 0.9410 - regression_loss: 0.8288 - classification_loss: 0.1123 203/500 [===========>..................] - ETA: 1:42 - loss: 0.9435 - regression_loss: 0.8309 - classification_loss: 0.1125 204/500 [===========>..................] - ETA: 1:41 - loss: 0.9458 - regression_loss: 0.8330 - classification_loss: 0.1128 205/500 [===========>..................] - ETA: 1:41 - loss: 0.9456 - regression_loss: 0.8329 - classification_loss: 0.1127 206/500 [===========>..................] 
- ETA: 1:41 - loss: 0.9441 - regression_loss: 0.8316 - classification_loss: 0.1125 207/500 [===========>..................] - ETA: 1:40 - loss: 0.9443 - regression_loss: 0.8318 - classification_loss: 0.1125 208/500 [===========>..................] - ETA: 1:40 - loss: 0.9463 - regression_loss: 0.8337 - classification_loss: 0.1126 209/500 [===========>..................] - ETA: 1:40 - loss: 0.9486 - regression_loss: 0.8357 - classification_loss: 0.1129 210/500 [===========>..................] - ETA: 1:39 - loss: 0.9484 - regression_loss: 0.8355 - classification_loss: 0.1129 211/500 [===========>..................] - ETA: 1:39 - loss: 0.9455 - regression_loss: 0.8329 - classification_loss: 0.1126 212/500 [===========>..................] - ETA: 1:39 - loss: 0.9447 - regression_loss: 0.8322 - classification_loss: 0.1125 213/500 [===========>..................] - ETA: 1:38 - loss: 0.9434 - regression_loss: 0.8310 - classification_loss: 0.1123 214/500 [===========>..................] - ETA: 1:38 - loss: 0.9449 - regression_loss: 0.8323 - classification_loss: 0.1126 215/500 [===========>..................] - ETA: 1:38 - loss: 0.9454 - regression_loss: 0.8328 - classification_loss: 0.1126 216/500 [===========>..................] - ETA: 1:37 - loss: 0.9452 - regression_loss: 0.8326 - classification_loss: 0.1126 217/500 [============>.................] - ETA: 1:37 - loss: 0.9448 - regression_loss: 0.8323 - classification_loss: 0.1125 218/500 [============>.................] - ETA: 1:36 - loss: 0.9446 - regression_loss: 0.8321 - classification_loss: 0.1125 219/500 [============>.................] - ETA: 1:36 - loss: 0.9424 - regression_loss: 0.8302 - classification_loss: 0.1122 220/500 [============>.................] - ETA: 1:36 - loss: 0.9412 - regression_loss: 0.8291 - classification_loss: 0.1120 221/500 [============>.................] - ETA: 1:35 - loss: 0.9416 - regression_loss: 0.8294 - classification_loss: 0.1122 222/500 [============>.................] 
- ETA: 1:35 - loss: 0.9423 - regression_loss: 0.8300 - classification_loss: 0.1123 223/500 [============>.................] - ETA: 1:35 - loss: 0.9413 - regression_loss: 0.8292 - classification_loss: 0.1121 224/500 [============>.................] - ETA: 1:34 - loss: 0.9424 - regression_loss: 0.8301 - classification_loss: 0.1122 225/500 [============>.................] - ETA: 1:34 - loss: 0.9432 - regression_loss: 0.8311 - classification_loss: 0.1122 226/500 [============>.................] - ETA: 1:34 - loss: 0.9436 - regression_loss: 0.8313 - classification_loss: 0.1122 227/500 [============>.................] - ETA: 1:33 - loss: 0.9446 - regression_loss: 0.8323 - classification_loss: 0.1123 228/500 [============>.................] - ETA: 1:33 - loss: 0.9428 - regression_loss: 0.8307 - classification_loss: 0.1120 229/500 [============>.................] - ETA: 1:33 - loss: 0.9425 - regression_loss: 0.8307 - classification_loss: 0.1118 230/500 [============>.................] - ETA: 1:32 - loss: 0.9403 - regression_loss: 0.8288 - classification_loss: 0.1115 231/500 [============>.................] - ETA: 1:32 - loss: 0.9388 - regression_loss: 0.8275 - classification_loss: 0.1113 232/500 [============>.................] - ETA: 1:32 - loss: 0.9402 - regression_loss: 0.8287 - classification_loss: 0.1115 233/500 [============>.................] - ETA: 1:31 - loss: 0.9404 - regression_loss: 0.8290 - classification_loss: 0.1114 234/500 [=============>................] - ETA: 1:31 - loss: 0.9372 - regression_loss: 0.8261 - classification_loss: 0.1111 235/500 [=============>................] - ETA: 1:31 - loss: 0.9385 - regression_loss: 0.8273 - classification_loss: 0.1112 236/500 [=============>................] - ETA: 1:30 - loss: 0.9387 - regression_loss: 0.8274 - classification_loss: 0.1113 237/500 [=============>................] - ETA: 1:30 - loss: 0.9374 - regression_loss: 0.8263 - classification_loss: 0.1111 238/500 [=============>................] 
- ETA: 1:30 - loss: 0.9375 - regression_loss: 0.8266 - classification_loss: 0.1109 239/500 [=============>................] - ETA: 1:29 - loss: 0.9358 - regression_loss: 0.8253 - classification_loss: 0.1106 240/500 [=============>................] - ETA: 1:29 - loss: 0.9355 - regression_loss: 0.8248 - classification_loss: 0.1107 241/500 [=============>................] - ETA: 1:28 - loss: 0.9338 - regression_loss: 0.8233 - classification_loss: 0.1105 242/500 [=============>................] - ETA: 1:28 - loss: 0.9334 - regression_loss: 0.8228 - classification_loss: 0.1106 243/500 [=============>................] - ETA: 1:28 - loss: 0.9359 - regression_loss: 0.8248 - classification_loss: 0.1110 244/500 [=============>................] - ETA: 1:27 - loss: 0.9374 - regression_loss: 0.8260 - classification_loss: 0.1114 245/500 [=============>................] - ETA: 1:27 - loss: 0.9367 - regression_loss: 0.8253 - classification_loss: 0.1115 246/500 [=============>................] - ETA: 1:27 - loss: 0.9369 - regression_loss: 0.8254 - classification_loss: 0.1114 247/500 [=============>................] - ETA: 1:26 - loss: 0.9348 - regression_loss: 0.8236 - classification_loss: 0.1112 248/500 [=============>................] - ETA: 1:26 - loss: 0.9340 - regression_loss: 0.8227 - classification_loss: 0.1113 249/500 [=============>................] - ETA: 1:26 - loss: 0.9326 - regression_loss: 0.8215 - classification_loss: 0.1110 250/500 [==============>...............] - ETA: 1:25 - loss: 0.9339 - regression_loss: 0.8227 - classification_loss: 0.1112 251/500 [==============>...............] - ETA: 1:25 - loss: 0.9347 - regression_loss: 0.8235 - classification_loss: 0.1113 252/500 [==============>...............] - ETA: 1:25 - loss: 0.9345 - regression_loss: 0.8233 - classification_loss: 0.1112 253/500 [==============>...............] - ETA: 1:24 - loss: 0.9342 - regression_loss: 0.8231 - classification_loss: 0.1111 254/500 [==============>...............] 
- ETA: 1:24 - loss: 0.9329 - regression_loss: 0.8220 - classification_loss: 0.1109 255/500 [==============>...............] - ETA: 1:24 - loss: 0.9312 - regression_loss: 0.8205 - classification_loss: 0.1106 256/500 [==============>...............] - ETA: 1:23 - loss: 0.9328 - regression_loss: 0.8219 - classification_loss: 0.1108 257/500 [==============>...............] - ETA: 1:23 - loss: 0.9333 - regression_loss: 0.8225 - classification_loss: 0.1108 258/500 [==============>...............] - ETA: 1:23 - loss: 0.9328 - regression_loss: 0.8221 - classification_loss: 0.1107 259/500 [==============>...............] - ETA: 1:22 - loss: 0.9315 - regression_loss: 0.8210 - classification_loss: 0.1105 260/500 [==============>...............] - ETA: 1:22 - loss: 0.9324 - regression_loss: 0.8217 - classification_loss: 0.1107 261/500 [==============>...............] - ETA: 1:22 - loss: 0.9316 - regression_loss: 0.8210 - classification_loss: 0.1106 262/500 [==============>...............] - ETA: 1:21 - loss: 0.9321 - regression_loss: 0.8215 - classification_loss: 0.1106 263/500 [==============>...............] - ETA: 1:21 - loss: 0.9313 - regression_loss: 0.8209 - classification_loss: 0.1104 264/500 [==============>...............] - ETA: 1:21 - loss: 0.9300 - regression_loss: 0.8198 - classification_loss: 0.1102 265/500 [==============>...............] - ETA: 1:20 - loss: 0.9311 - regression_loss: 0.8208 - classification_loss: 0.1103 266/500 [==============>...............] - ETA: 1:20 - loss: 0.9302 - regression_loss: 0.8200 - classification_loss: 0.1102 267/500 [===============>..............] - ETA: 1:20 - loss: 0.9296 - regression_loss: 0.8195 - classification_loss: 0.1101 268/500 [===============>..............] - ETA: 1:19 - loss: 0.9281 - regression_loss: 0.8181 - classification_loss: 0.1100 269/500 [===============>..............] - ETA: 1:19 - loss: 0.9296 - regression_loss: 0.8193 - classification_loss: 0.1103 270/500 [===============>..............] 
- ETA: 1:19 - loss: 0.9302 - regression_loss: 0.8198 - classification_loss: 0.1104 271/500 [===============>..............] - ETA: 1:18 - loss: 0.9287 - regression_loss: 0.8186 - classification_loss: 0.1102 272/500 [===============>..............] - ETA: 1:18 - loss: 0.9297 - regression_loss: 0.8197 - classification_loss: 0.1100 273/500 [===============>..............] - ETA: 1:18 - loss: 0.9307 - regression_loss: 0.8206 - classification_loss: 0.1100 274/500 [===============>..............] - ETA: 1:17 - loss: 0.9292 - regression_loss: 0.8193 - classification_loss: 0.1099 275/500 [===============>..............] - ETA: 1:17 - loss: 0.9276 - regression_loss: 0.8180 - classification_loss: 0.1096 276/500 [===============>..............] - ETA: 1:16 - loss: 0.9265 - regression_loss: 0.8170 - classification_loss: 0.1095 277/500 [===============>..............] - ETA: 1:16 - loss: 0.9272 - regression_loss: 0.8175 - classification_loss: 0.1097 278/500 [===============>..............] - ETA: 1:16 - loss: 0.9280 - regression_loss: 0.8183 - classification_loss: 0.1097 279/500 [===============>..............] - ETA: 1:15 - loss: 0.9273 - regression_loss: 0.8177 - classification_loss: 0.1096 280/500 [===============>..............] - ETA: 1:15 - loss: 0.9270 - regression_loss: 0.8174 - classification_loss: 0.1096 281/500 [===============>..............] - ETA: 1:15 - loss: 0.9256 - regression_loss: 0.8162 - classification_loss: 0.1094 282/500 [===============>..............] - ETA: 1:14 - loss: 0.9252 - regression_loss: 0.8159 - classification_loss: 0.1093 283/500 [===============>..............] - ETA: 1:14 - loss: 0.9262 - regression_loss: 0.8169 - classification_loss: 0.1093 284/500 [================>.............] - ETA: 1:14 - loss: 0.9257 - regression_loss: 0.8166 - classification_loss: 0.1092 285/500 [================>.............] - ETA: 1:13 - loss: 0.9258 - regression_loss: 0.8166 - classification_loss: 0.1092 286/500 [================>.............] 
- ETA: 1:13 - loss: 0.9269 - regression_loss: 0.8175 - classification_loss: 0.1094 287/500 [================>.............] - ETA: 1:13 - loss: 0.9285 - regression_loss: 0.8191 - classification_loss: 0.1093 288/500 [================>.............] - ETA: 1:12 - loss: 0.9276 - regression_loss: 0.8185 - classification_loss: 0.1091 289/500 [================>.............] - ETA: 1:12 - loss: 0.9279 - regression_loss: 0.8188 - classification_loss: 0.1091 290/500 [================>.............] - ETA: 1:12 - loss: 0.9263 - regression_loss: 0.8174 - classification_loss: 0.1088 291/500 [================>.............] - ETA: 1:11 - loss: 0.9260 - regression_loss: 0.8173 - classification_loss: 0.1087 292/500 [================>.............] - ETA: 1:11 - loss: 0.9270 - regression_loss: 0.8181 - classification_loss: 0.1089 293/500 [================>.............] - ETA: 1:11 - loss: 0.9258 - regression_loss: 0.8170 - classification_loss: 0.1088 294/500 [================>.............] - ETA: 1:10 - loss: 0.9262 - regression_loss: 0.8174 - classification_loss: 0.1088 295/500 [================>.............] - ETA: 1:10 - loss: 0.9256 - regression_loss: 0.8168 - classification_loss: 0.1087 296/500 [================>.............] - ETA: 1:10 - loss: 0.9252 - regression_loss: 0.8165 - classification_loss: 0.1087 297/500 [================>.............] - ETA: 1:09 - loss: 0.9245 - regression_loss: 0.8160 - classification_loss: 0.1086 298/500 [================>.............] - ETA: 1:09 - loss: 0.9259 - regression_loss: 0.8170 - classification_loss: 0.1088 299/500 [================>.............] - ETA: 1:09 - loss: 0.9257 - regression_loss: 0.8169 - classification_loss: 0.1088 300/500 [=================>............] - ETA: 1:08 - loss: 0.9253 - regression_loss: 0.8164 - classification_loss: 0.1089 301/500 [=================>............] - ETA: 1:08 - loss: 0.9245 - regression_loss: 0.8157 - classification_loss: 0.1088 302/500 [=================>............] 
- ETA: 1:08 - loss: 0.9243 - regression_loss: 0.8156 - classification_loss: 0.1087 303/500 [=================>............] - ETA: 1:07 - loss: 0.9238 - regression_loss: 0.8152 - classification_loss: 0.1086 304/500 [=================>............] - ETA: 1:07 - loss: 0.9245 - regression_loss: 0.8157 - classification_loss: 0.1088 305/500 [=================>............] - ETA: 1:07 - loss: 0.9235 - regression_loss: 0.8147 - classification_loss: 0.1088 306/500 [=================>............] - ETA: 1:06 - loss: 0.9221 - regression_loss: 0.8136 - classification_loss: 0.1086 307/500 [=================>............] - ETA: 1:06 - loss: 0.9207 - regression_loss: 0.8124 - classification_loss: 0.1083 308/500 [=================>............] - ETA: 1:05 - loss: 0.9207 - regression_loss: 0.8123 - classification_loss: 0.1085 309/500 [=================>............] - ETA: 1:05 - loss: 0.9209 - regression_loss: 0.8123 - classification_loss: 0.1085 310/500 [=================>............] - ETA: 1:05 - loss: 0.9206 - regression_loss: 0.8121 - classification_loss: 0.1084 311/500 [=================>............] - ETA: 1:04 - loss: 0.9213 - regression_loss: 0.8127 - classification_loss: 0.1086 312/500 [=================>............] - ETA: 1:04 - loss: 0.9203 - regression_loss: 0.8119 - classification_loss: 0.1084 313/500 [=================>............] - ETA: 1:04 - loss: 0.9195 - regression_loss: 0.8112 - classification_loss: 0.1083 314/500 [=================>............] - ETA: 1:03 - loss: 0.9212 - regression_loss: 0.8127 - classification_loss: 0.1085 315/500 [=================>............] - ETA: 1:03 - loss: 0.9216 - regression_loss: 0.8131 - classification_loss: 0.1085 316/500 [=================>............] - ETA: 1:03 - loss: 0.9228 - regression_loss: 0.8143 - classification_loss: 0.1085 317/500 [==================>...........] - ETA: 1:02 - loss: 0.9215 - regression_loss: 0.8132 - classification_loss: 0.1083 318/500 [==================>...........] 
- ETA: 1:02 - loss: 0.9217 - regression_loss: 0.8134 - classification_loss: 0.1083 319/500 [==================>...........] - ETA: 1:02 - loss: 0.9216 - regression_loss: 0.8133 - classification_loss: 0.1083 320/500 [==================>...........] - ETA: 1:01 - loss: 0.9222 - regression_loss: 0.8138 - classification_loss: 0.1084 321/500 [==================>...........] - ETA: 1:01 - loss: 0.9208 - regression_loss: 0.8126 - classification_loss: 0.1082 322/500 [==================>...........] - ETA: 1:01 - loss: 0.9199 - regression_loss: 0.8118 - classification_loss: 0.1081 323/500 [==================>...........] - ETA: 1:00 - loss: 0.9191 - regression_loss: 0.8112 - classification_loss: 0.1079 324/500 [==================>...........] - ETA: 1:00 - loss: 0.9211 - regression_loss: 0.8130 - classification_loss: 0.1082 325/500 [==================>...........] - ETA: 1:00 - loss: 0.9222 - regression_loss: 0.8139 - classification_loss: 0.1082 326/500 [==================>...........] - ETA: 59s - loss: 0.9210 - regression_loss: 0.8129 - classification_loss: 0.1081  327/500 [==================>...........] - ETA: 59s - loss: 0.9204 - regression_loss: 0.8124 - classification_loss: 0.1081 328/500 [==================>...........] - ETA: 59s - loss: 0.9200 - regression_loss: 0.8119 - classification_loss: 0.1080 329/500 [==================>...........] - ETA: 58s - loss: 0.9201 - regression_loss: 0.8121 - classification_loss: 0.1080 330/500 [==================>...........] - ETA: 58s - loss: 0.9202 - regression_loss: 0.8122 - classification_loss: 0.1080 331/500 [==================>...........] - ETA: 58s - loss: 0.9205 - regression_loss: 0.8125 - classification_loss: 0.1080 332/500 [==================>...........] - ETA: 57s - loss: 0.9214 - regression_loss: 0.8132 - classification_loss: 0.1081 333/500 [==================>...........] - ETA: 57s - loss: 0.9215 - regression_loss: 0.8134 - classification_loss: 0.1081 334/500 [===================>..........] 
- ETA: 57s - loss: 0.9202 - regression_loss: 0.8122 - classification_loss: 0.1080 335/500 [===================>..........] - ETA: 56s - loss: 0.9193 - regression_loss: 0.8114 - classification_loss: 0.1078 336/500 [===================>..........] - ETA: 56s - loss: 0.9181 - regression_loss: 0.8105 - classification_loss: 0.1076 337/500 [===================>..........] - ETA: 56s - loss: 0.9182 - regression_loss: 0.8106 - classification_loss: 0.1075 338/500 [===================>..........] - ETA: 55s - loss: 0.9177 - regression_loss: 0.8102 - classification_loss: 0.1076 339/500 [===================>..........] - ETA: 55s - loss: 0.9185 - regression_loss: 0.8109 - classification_loss: 0.1076 340/500 [===================>..........] - ETA: 54s - loss: 0.9184 - regression_loss: 0.8108 - classification_loss: 0.1076 341/500 [===================>..........] - ETA: 54s - loss: 0.9186 - regression_loss: 0.8110 - classification_loss: 0.1077 342/500 [===================>..........] - ETA: 54s - loss: 0.9194 - regression_loss: 0.8116 - classification_loss: 0.1078 343/500 [===================>..........] - ETA: 53s - loss: 0.9192 - regression_loss: 0.8113 - classification_loss: 0.1079 344/500 [===================>..........] - ETA: 53s - loss: 0.9194 - regression_loss: 0.8115 - classification_loss: 0.1078 345/500 [===================>..........] - ETA: 53s - loss: 0.9211 - regression_loss: 0.8131 - classification_loss: 0.1080 346/500 [===================>..........] - ETA: 52s - loss: 0.9226 - regression_loss: 0.8144 - classification_loss: 0.1082 347/500 [===================>..........] - ETA: 52s - loss: 0.9235 - regression_loss: 0.8151 - classification_loss: 0.1083 348/500 [===================>..........] - ETA: 52s - loss: 0.9237 - regression_loss: 0.8152 - classification_loss: 0.1084 349/500 [===================>..........] - ETA: 51s - loss: 0.9242 - regression_loss: 0.8158 - classification_loss: 0.1084 350/500 [====================>.........] 
- ETA: 51s - loss: 0.9242 - regression_loss: 0.8157 - classification_loss: 0.1085 351/500 [====================>.........] - ETA: 51s - loss: 0.9239 - regression_loss: 0.8155 - classification_loss: 0.1085 352/500 [====================>.........] - ETA: 50s - loss: 0.9234 - regression_loss: 0.8150 - classification_loss: 0.1083 353/500 [====================>.........] - ETA: 50s - loss: 0.9225 - regression_loss: 0.8143 - classification_loss: 0.1082 354/500 [====================>.........] - ETA: 50s - loss: 0.9217 - regression_loss: 0.8136 - classification_loss: 0.1081 355/500 [====================>.........] - ETA: 49s - loss: 0.9213 - regression_loss: 0.8133 - classification_loss: 0.1080 356/500 [====================>.........] - ETA: 49s - loss: 0.9210 - regression_loss: 0.8130 - classification_loss: 0.1079 357/500 [====================>.........] - ETA: 49s - loss: 0.9199 - regression_loss: 0.8120 - classification_loss: 0.1079 358/500 [====================>.........] - ETA: 48s - loss: 0.9189 - regression_loss: 0.8111 - classification_loss: 0.1078 359/500 [====================>.........] - ETA: 48s - loss: 0.9205 - regression_loss: 0.8123 - classification_loss: 0.1082 360/500 [====================>.........] - ETA: 48s - loss: 0.9202 - regression_loss: 0.8120 - classification_loss: 0.1082 361/500 [====================>.........] - ETA: 47s - loss: 0.9192 - regression_loss: 0.8111 - classification_loss: 0.1080 362/500 [====================>.........] - ETA: 47s - loss: 0.9189 - regression_loss: 0.8109 - classification_loss: 0.1080 363/500 [====================>.........] - ETA: 47s - loss: 0.9187 - regression_loss: 0.8108 - classification_loss: 0.1079 364/500 [====================>.........] - ETA: 46s - loss: 0.9192 - regression_loss: 0.8113 - classification_loss: 0.1079 365/500 [====================>.........] - ETA: 46s - loss: 0.9198 - regression_loss: 0.8119 - classification_loss: 0.1079 366/500 [====================>.........] 
- ETA: 46s - loss: 0.9195 - regression_loss: 0.8116 - classification_loss: 0.1079 367/500 [=====================>........] - ETA: 45s - loss: 0.9189 - regression_loss: 0.8110 - classification_loss: 0.1079 368/500 [=====================>........] - ETA: 45s - loss: 0.9188 - regression_loss: 0.8110 - classification_loss: 0.1079 369/500 [=====================>........] - ETA: 44s - loss: 0.9195 - regression_loss: 0.8116 - classification_loss: 0.1079 370/500 [=====================>........] - ETA: 44s - loss: 0.9190 - regression_loss: 0.8112 - classification_loss: 0.1078 371/500 [=====================>........] - ETA: 44s - loss: 0.9185 - regression_loss: 0.8108 - classification_loss: 0.1078 372/500 [=====================>........] - ETA: 43s - loss: 0.9190 - regression_loss: 0.8110 - classification_loss: 0.1079 373/500 [=====================>........] - ETA: 43s - loss: 0.9179 - regression_loss: 0.8101 - classification_loss: 0.1078 374/500 [=====================>........] - ETA: 43s - loss: 0.9177 - regression_loss: 0.8099 - classification_loss: 0.1078 375/500 [=====================>........] - ETA: 42s - loss: 0.9180 - regression_loss: 0.8101 - classification_loss: 0.1079 376/500 [=====================>........] - ETA: 42s - loss: 0.9179 - regression_loss: 0.8100 - classification_loss: 0.1079 377/500 [=====================>........] - ETA: 42s - loss: 0.9172 - regression_loss: 0.8094 - classification_loss: 0.1078 378/500 [=====================>........] - ETA: 41s - loss: 0.9158 - regression_loss: 0.8081 - classification_loss: 0.1077 379/500 [=====================>........] - ETA: 41s - loss: 0.9155 - regression_loss: 0.8078 - classification_loss: 0.1077 380/500 [=====================>........] - ETA: 41s - loss: 0.9154 - regression_loss: 0.8077 - classification_loss: 0.1077 381/500 [=====================>........] - ETA: 40s - loss: 0.9162 - regression_loss: 0.8084 - classification_loss: 0.1078 382/500 [=====================>........] 
- ETA: 40s - loss: 0.9160 - regression_loss: 0.8082 - classification_loss: 0.1078 383/500 [=====================>........] - ETA: 40s - loss: 0.9169 - regression_loss: 0.8090 - classification_loss: 0.1079 384/500 [======================>.......] - ETA: 39s - loss: 0.9163 - regression_loss: 0.8084 - classification_loss: 0.1078 385/500 [======================>.......] - ETA: 39s - loss: 0.9168 - regression_loss: 0.8089 - classification_loss: 0.1078 386/500 [======================>.......] - ETA: 39s - loss: 0.9157 - regression_loss: 0.8080 - classification_loss: 0.1077 387/500 [======================>.......] - ETA: 38s - loss: 0.9147 - regression_loss: 0.8072 - classification_loss: 0.1075 388/500 [======================>.......] - ETA: 38s - loss: 0.9152 - regression_loss: 0.8077 - classification_loss: 0.1075 389/500 [======================>.......] - ETA: 38s - loss: 0.9147 - regression_loss: 0.8074 - classification_loss: 0.1074 390/500 [======================>.......] - ETA: 37s - loss: 0.9143 - regression_loss: 0.8070 - classification_loss: 0.1072 391/500 [======================>.......] - ETA: 37s - loss: 0.9141 - regression_loss: 0.8069 - classification_loss: 0.1073 392/500 [======================>.......] - ETA: 37s - loss: 0.9137 - regression_loss: 0.8064 - classification_loss: 0.1072 393/500 [======================>.......] - ETA: 36s - loss: 0.9151 - regression_loss: 0.8078 - classification_loss: 0.1073 394/500 [======================>.......] - ETA: 36s - loss: 0.9154 - regression_loss: 0.8081 - classification_loss: 0.1073 395/500 [======================>.......] - ETA: 36s - loss: 0.9153 - regression_loss: 0.8080 - classification_loss: 0.1073 396/500 [======================>.......] - ETA: 35s - loss: 0.9146 - regression_loss: 0.8073 - classification_loss: 0.1073 397/500 [======================>.......] - ETA: 35s - loss: 0.9147 - regression_loss: 0.8075 - classification_loss: 0.1072 398/500 [======================>.......] 
- ETA: 35s - loss: 0.9154 - regression_loss: 0.8082 - classification_loss: 0.1072 399/500 [======================>.......] - ETA: 34s - loss: 0.9160 - regression_loss: 0.8087 - classification_loss: 0.1073 400/500 [=======================>......] - ETA: 34s - loss: 0.9170 - regression_loss: 0.8096 - classification_loss: 0.1074 401/500 [=======================>......] - ETA: 33s - loss: 0.9166 - regression_loss: 0.8093 - classification_loss: 0.1073 402/500 [=======================>......] - ETA: 33s - loss: 0.9166 - regression_loss: 0.8093 - classification_loss: 0.1073 403/500 [=======================>......] - ETA: 33s - loss: 0.9167 - regression_loss: 0.8093 - classification_loss: 0.1074 404/500 [=======================>......] - ETA: 32s - loss: 0.9177 - regression_loss: 0.8101 - classification_loss: 0.1075 405/500 [=======================>......] - ETA: 32s - loss: 0.9173 - regression_loss: 0.8098 - classification_loss: 0.1075 406/500 [=======================>......] - ETA: 32s - loss: 0.9177 - regression_loss: 0.8102 - classification_loss: 0.1075 407/500 [=======================>......] - ETA: 31s - loss: 0.9178 - regression_loss: 0.8100 - classification_loss: 0.1078 408/500 [=======================>......] - ETA: 31s - loss: 0.9180 - regression_loss: 0.8100 - classification_loss: 0.1080 409/500 [=======================>......] - ETA: 31s - loss: 0.9180 - regression_loss: 0.8100 - classification_loss: 0.1080 410/500 [=======================>......] - ETA: 30s - loss: 0.9180 - regression_loss: 0.8100 - classification_loss: 0.1080 411/500 [=======================>......] - ETA: 30s - loss: 0.9177 - regression_loss: 0.8097 - classification_loss: 0.1080 412/500 [=======================>......] - ETA: 30s - loss: 0.9174 - regression_loss: 0.8095 - classification_loss: 0.1079 413/500 [=======================>......] - ETA: 29s - loss: 0.9176 - regression_loss: 0.8097 - classification_loss: 0.1079 414/500 [=======================>......] 
- ETA: 29s - loss: 0.9177 - regression_loss: 0.8098 - classification_loss: 0.1079 415/500 [=======================>......] - ETA: 29s - loss: 0.9165 - regression_loss: 0.8089 - classification_loss: 0.1077 416/500 [=======================>......] - ETA: 28s - loss: 0.9163 - regression_loss: 0.8087 - classification_loss: 0.1075 417/500 [========================>.....] - ETA: 28s - loss: 0.9173 - regression_loss: 0.8097 - classification_loss: 0.1076 418/500 [========================>.....] - ETA: 28s - loss: 0.9178 - regression_loss: 0.8101 - classification_loss: 0.1077 419/500 [========================>.....] - ETA: 27s - loss: 0.9183 - regression_loss: 0.8105 - classification_loss: 0.1078 420/500 [========================>.....] - ETA: 27s - loss: 0.9199 - regression_loss: 0.8119 - classification_loss: 0.1080 421/500 [========================>.....] - ETA: 27s - loss: 0.9212 - regression_loss: 0.8131 - classification_loss: 0.1081 422/500 [========================>.....] - ETA: 26s - loss: 0.9223 - regression_loss: 0.8140 - classification_loss: 0.1083 423/500 [========================>.....] - ETA: 26s - loss: 0.9235 - regression_loss: 0.8151 - classification_loss: 0.1084 424/500 [========================>.....] - ETA: 26s - loss: 0.9224 - regression_loss: 0.8141 - classification_loss: 0.1083 425/500 [========================>.....] - ETA: 25s - loss: 0.9226 - regression_loss: 0.8143 - classification_loss: 0.1083 426/500 [========================>.....] - ETA: 25s - loss: 0.9227 - regression_loss: 0.8144 - classification_loss: 0.1083 427/500 [========================>.....] - ETA: 25s - loss: 0.9230 - regression_loss: 0.8148 - classification_loss: 0.1083 428/500 [========================>.....] - ETA: 24s - loss: 0.9237 - regression_loss: 0.8154 - classification_loss: 0.1083 429/500 [========================>.....] - ETA: 24s - loss: 0.9236 - regression_loss: 0.8154 - classification_loss: 0.1082 430/500 [========================>.....] 
- ETA: 24s - loss: 0.9236 - regression_loss: 0.8154 - classification_loss: 0.1082 431/500 [========================>.....] - ETA: 23s - loss: 0.9252 - regression_loss: 0.8167 - classification_loss: 0.1086 432/500 [========================>.....] - ETA: 23s - loss: 0.9247 - regression_loss: 0.8162 - classification_loss: 0.1085 433/500 [========================>.....] - ETA: 22s - loss: 0.9257 - regression_loss: 0.8171 - classification_loss: 0.1086 434/500 [=========================>....] - ETA: 22s - loss: 0.9257 - regression_loss: 0.8172 - classification_loss: 0.1085 435/500 [=========================>....] - ETA: 22s - loss: 0.9253 - regression_loss: 0.8170 - classification_loss: 0.1084 436/500 [=========================>....] - ETA: 21s - loss: 0.9256 - regression_loss: 0.8172 - classification_loss: 0.1084 437/500 [=========================>....] - ETA: 21s - loss: 0.9257 - regression_loss: 0.8174 - classification_loss: 0.1084 438/500 [=========================>....] - ETA: 21s - loss: 0.9267 - regression_loss: 0.8182 - classification_loss: 0.1085 439/500 [=========================>....] - ETA: 20s - loss: 0.9255 - regression_loss: 0.8172 - classification_loss: 0.1083 440/500 [=========================>....] - ETA: 20s - loss: 0.9248 - regression_loss: 0.8166 - classification_loss: 0.1082 441/500 [=========================>....] - ETA: 20s - loss: 0.9244 - regression_loss: 0.8162 - classification_loss: 0.1082 442/500 [=========================>....] - ETA: 19s - loss: 0.9237 - regression_loss: 0.8156 - classification_loss: 0.1081 443/500 [=========================>....] - ETA: 19s - loss: 0.9244 - regression_loss: 0.8162 - classification_loss: 0.1082 444/500 [=========================>....] - ETA: 19s - loss: 0.9251 - regression_loss: 0.8167 - classification_loss: 0.1084 445/500 [=========================>....] - ETA: 18s - loss: 0.9253 - regression_loss: 0.8170 - classification_loss: 0.1084 446/500 [=========================>....] 
- ETA: 18s - loss: 0.9243 - regression_loss: 0.8161 - classification_loss: 0.1082 447/500 [=========================>....] - ETA: 18s - loss: 0.9237 - regression_loss: 0.8156 - classification_loss: 0.1081 448/500 [=========================>....] - ETA: 17s - loss: 0.9239 - regression_loss: 0.8158 - classification_loss: 0.1081 449/500 [=========================>....] - ETA: 17s - loss: 0.9242 - regression_loss: 0.8160 - classification_loss: 0.1082 450/500 [==========================>...] - ETA: 17s - loss: 0.9232 - regression_loss: 0.8152 - classification_loss: 0.1081 451/500 [==========================>...] - ETA: 16s - loss: 0.9240 - regression_loss: 0.8159 - classification_loss: 0.1081 452/500 [==========================>...] - ETA: 16s - loss: 0.9231 - regression_loss: 0.8151 - classification_loss: 0.1080 453/500 [==========================>...] - ETA: 16s - loss: 0.9237 - regression_loss: 0.8156 - classification_loss: 0.1081 454/500 [==========================>...] - ETA: 15s - loss: 0.9225 - regression_loss: 0.8146 - classification_loss: 0.1080 455/500 [==========================>...] - ETA: 15s - loss: 0.9228 - regression_loss: 0.8148 - classification_loss: 0.1081 456/500 [==========================>...] - ETA: 15s - loss: 0.9224 - regression_loss: 0.8145 - classification_loss: 0.1080 457/500 [==========================>...] - ETA: 14s - loss: 0.9218 - regression_loss: 0.8139 - classification_loss: 0.1079 458/500 [==========================>...] - ETA: 14s - loss: 0.9227 - regression_loss: 0.8147 - classification_loss: 0.1080 459/500 [==========================>...] - ETA: 14s - loss: 0.9220 - regression_loss: 0.8141 - classification_loss: 0.1079 460/500 [==========================>...] - ETA: 13s - loss: 0.9215 - regression_loss: 0.8137 - classification_loss: 0.1079 461/500 [==========================>...] - ETA: 13s - loss: 0.9226 - regression_loss: 0.8146 - classification_loss: 0.1080 462/500 [==========================>...] 
- ETA: 13s - loss: 0.9219 - regression_loss: 0.8140 - classification_loss: 0.1079 463/500 [==========================>...] - ETA: 12s - loss: 0.9222 - regression_loss: 0.8142 - classification_loss: 0.1080 464/500 [==========================>...] - ETA: 12s - loss: 0.9228 - regression_loss: 0.8147 - classification_loss: 0.1080 465/500 [==========================>...] - ETA: 12s - loss: 0.9216 - regression_loss: 0.8137 - classification_loss: 0.1079 466/500 [==========================>...] - ETA: 11s - loss: 0.9224 - regression_loss: 0.8143 - classification_loss: 0.1081 467/500 [===========================>..] - ETA: 11s - loss: 0.9228 - regression_loss: 0.8147 - classification_loss: 0.1081 468/500 [===========================>..] - ETA: 10s - loss: 0.9229 - regression_loss: 0.8148 - classification_loss: 0.1081 469/500 [===========================>..] - ETA: 10s - loss: 0.9224 - regression_loss: 0.8144 - classification_loss: 0.1080 470/500 [===========================>..] - ETA: 10s - loss: 0.9222 - regression_loss: 0.8143 - classification_loss: 0.1079 471/500 [===========================>..] - ETA: 9s - loss: 0.9213 - regression_loss: 0.8134 - classification_loss: 0.1078  472/500 [===========================>..] - ETA: 9s - loss: 0.9212 - regression_loss: 0.8133 - classification_loss: 0.1079 473/500 [===========================>..] - ETA: 9s - loss: 0.9204 - regression_loss: 0.8127 - classification_loss: 0.1078 474/500 [===========================>..] - ETA: 8s - loss: 0.9215 - regression_loss: 0.8136 - classification_loss: 0.1079 475/500 [===========================>..] - ETA: 8s - loss: 0.9213 - regression_loss: 0.8134 - classification_loss: 0.1079 476/500 [===========================>..] - ETA: 8s - loss: 0.9212 - regression_loss: 0.8133 - classification_loss: 0.1079 477/500 [===========================>..] - ETA: 7s - loss: 0.9212 - regression_loss: 0.8133 - classification_loss: 0.1079 478/500 [===========================>..] 
- ETA: 7s - loss: 0.9217 - regression_loss: 0.8137 - classification_loss: 0.1080 479/500 [===========================>..] - ETA: 7s - loss: 0.9216 - regression_loss: 0.8137 - classification_loss: 0.1080 480/500 [===========================>..] - ETA: 6s - loss: 0.9210 - regression_loss: 0.8131 - classification_loss: 0.1079 481/500 [===========================>..] - ETA: 6s - loss: 0.9209 - regression_loss: 0.8131 - classification_loss: 0.1079 482/500 [===========================>..] - ETA: 6s - loss: 0.9207 - regression_loss: 0.8128 - classification_loss: 0.1079 483/500 [===========================>..] - ETA: 5s - loss: 0.9210 - regression_loss: 0.8131 - classification_loss: 0.1080 484/500 [============================>.] - ETA: 5s - loss: 0.9212 - regression_loss: 0.8132 - classification_loss: 0.1080 485/500 [============================>.] - ETA: 5s - loss: 0.9212 - regression_loss: 0.8132 - classification_loss: 0.1080 486/500 [============================>.] - ETA: 4s - loss: 0.9209 - regression_loss: 0.8129 - classification_loss: 0.1080 487/500 [============================>.] - ETA: 4s - loss: 0.9211 - regression_loss: 0.8131 - classification_loss: 0.1080 488/500 [============================>.] - ETA: 4s - loss: 0.9206 - regression_loss: 0.8127 - classification_loss: 0.1079 489/500 [============================>.] - ETA: 3s - loss: 0.9208 - regression_loss: 0.8129 - classification_loss: 0.1079 490/500 [============================>.] - ETA: 3s - loss: 0.9198 - regression_loss: 0.8120 - classification_loss: 0.1078 491/500 [============================>.] - ETA: 3s - loss: 0.9193 - regression_loss: 0.8115 - classification_loss: 0.1078 492/500 [============================>.] - ETA: 2s - loss: 0.9199 - regression_loss: 0.8120 - classification_loss: 0.1078 493/500 [============================>.] - ETA: 2s - loss: 0.9198 - regression_loss: 0.8120 - classification_loss: 0.1078 494/500 [============================>.] 
- ETA: 2s - loss: 0.9202 - regression_loss: 0.8125 - classification_loss: 0.1078 495/500 [============================>.] - ETA: 1s - loss: 0.9205 - regression_loss: 0.8126 - classification_loss: 0.1078 496/500 [============================>.] - ETA: 1s - loss: 0.9208 - regression_loss: 0.8129 - classification_loss: 0.1079 497/500 [============================>.] - ETA: 1s - loss: 0.9196 - regression_loss: 0.8119 - classification_loss: 0.1077 498/500 [============================>.] - ETA: 0s - loss: 0.9196 - regression_loss: 0.8119 - classification_loss: 0.1077 499/500 [============================>.] - ETA: 0s - loss: 0.9189 - regression_loss: 0.8114 - classification_loss: 0.1076 500/500 [==============================] - 172s 344ms/step - loss: 0.9185 - regression_loss: 0.8110 - classification_loss: 0.1075 1172 instances of class plum with average precision: 0.6979 mAP: 0.6979 Epoch 00010: saving model to ./training/snapshots/resnet101_pascal_10.h5 Epoch 11/150 1/500 [..............................] - ETA: 2:42 - loss: 1.0869 - regression_loss: 0.9588 - classification_loss: 0.1281 2/500 [..............................] - ETA: 2:45 - loss: 1.0410 - regression_loss: 0.9372 - classification_loss: 0.1038 3/500 [..............................] - ETA: 2:47 - loss: 1.0496 - regression_loss: 0.9379 - classification_loss: 0.1117 4/500 [..............................] - ETA: 2:46 - loss: 0.9499 - regression_loss: 0.8513 - classification_loss: 0.0986 5/500 [..............................] - ETA: 2:46 - loss: 0.9839 - regression_loss: 0.8731 - classification_loss: 0.1107 6/500 [..............................] - ETA: 2:46 - loss: 1.0887 - regression_loss: 0.9783 - classification_loss: 0.1105 7/500 [..............................] - ETA: 2:46 - loss: 1.0449 - regression_loss: 0.9372 - classification_loss: 0.1077 8/500 [..............................] - ETA: 2:47 - loss: 1.0199 - regression_loss: 0.9147 - classification_loss: 0.1052 9/500 [..............................] 
- ETA: 2:47 - loss: 0.9765 - regression_loss: 0.8721 - classification_loss: 0.1044 10/500 [..............................] - ETA: 2:48 - loss: 0.9717 - regression_loss: 0.8679 - classification_loss: 0.1037 11/500 [..............................] - ETA: 2:47 - loss: 0.9368 - regression_loss: 0.8370 - classification_loss: 0.0998 12/500 [..............................] - ETA: 2:47 - loss: 0.9024 - regression_loss: 0.8076 - classification_loss: 0.0949 13/500 [..............................] - ETA: 2:47 - loss: 0.8945 - regression_loss: 0.7992 - classification_loss: 0.0953 14/500 [..............................] - ETA: 2:47 - loss: 0.9353 - regression_loss: 0.8357 - classification_loss: 0.0996 15/500 [..............................] - ETA: 2:46 - loss: 0.9507 - regression_loss: 0.8456 - classification_loss: 0.1051 16/500 [..............................] - ETA: 2:46 - loss: 0.9476 - regression_loss: 0.8416 - classification_loss: 0.1060 17/500 [>.............................] - ETA: 2:46 - loss: 0.9328 - regression_loss: 0.8285 - classification_loss: 0.1042 18/500 [>.............................] - ETA: 2:45 - loss: 0.9318 - regression_loss: 0.8270 - classification_loss: 0.1047 19/500 [>.............................] - ETA: 2:44 - loss: 0.9437 - regression_loss: 0.8381 - classification_loss: 0.1056 20/500 [>.............................] - ETA: 2:44 - loss: 0.9317 - regression_loss: 0.8270 - classification_loss: 0.1047 21/500 [>.............................] - ETA: 2:44 - loss: 0.9414 - regression_loss: 0.8348 - classification_loss: 0.1066 22/500 [>.............................] - ETA: 2:43 - loss: 0.9541 - regression_loss: 0.8483 - classification_loss: 0.1058 23/500 [>.............................] - ETA: 2:43 - loss: 0.9626 - regression_loss: 0.8577 - classification_loss: 0.1050 24/500 [>.............................] - ETA: 2:42 - loss: 0.9656 - regression_loss: 0.8597 - classification_loss: 0.1058 25/500 [>.............................] 
- ETA: 2:42 - loss: 0.9577 - regression_loss: 0.8529 - classification_loss: 0.1048 26/500 [>.............................] - ETA: 2:41 - loss: 0.9423 - regression_loss: 0.8391 - classification_loss: 0.1031 27/500 [>.............................] - ETA: 2:41 - loss: 0.9493 - regression_loss: 0.8451 - classification_loss: 0.1042 28/500 [>.............................] - ETA: 2:40 - loss: 0.9378 - regression_loss: 0.8347 - classification_loss: 0.1030 29/500 [>.............................] - ETA: 2:40 - loss: 0.9531 - regression_loss: 0.8498 - classification_loss: 0.1033 30/500 [>.............................] - ETA: 2:39 - loss: 0.9463 - regression_loss: 0.8443 - classification_loss: 0.1020 31/500 [>.............................] - ETA: 2:39 - loss: 0.9450 - regression_loss: 0.8432 - classification_loss: 0.1018 32/500 [>.............................] - ETA: 2:39 - loss: 0.9282 - regression_loss: 0.8281 - classification_loss: 0.1001 33/500 [>.............................] - ETA: 2:39 - loss: 0.9291 - regression_loss: 0.8290 - classification_loss: 0.1001 34/500 [=>............................] - ETA: 2:38 - loss: 0.9486 - regression_loss: 0.8450 - classification_loss: 0.1035 35/500 [=>............................] - ETA: 2:38 - loss: 0.9530 - regression_loss: 0.8489 - classification_loss: 0.1042 36/500 [=>............................] - ETA: 2:38 - loss: 0.9506 - regression_loss: 0.8460 - classification_loss: 0.1046 37/500 [=>............................] - ETA: 2:37 - loss: 0.9546 - regression_loss: 0.8492 - classification_loss: 0.1054 38/500 [=>............................] - ETA: 2:37 - loss: 0.9452 - regression_loss: 0.8414 - classification_loss: 0.1039 39/500 [=>............................] - ETA: 2:37 - loss: 0.9567 - regression_loss: 0.8512 - classification_loss: 0.1055 40/500 [=>............................] - ETA: 2:37 - loss: 0.9530 - regression_loss: 0.8476 - classification_loss: 0.1054 41/500 [=>............................] 
- ETA: 2:36 - loss: 0.9474 - regression_loss: 0.8425 - classification_loss: 0.1049 42/500 [=>............................] - ETA: 2:36 - loss: 0.9454 - regression_loss: 0.8411 - classification_loss: 0.1043 43/500 [=>............................] - ETA: 2:36 - loss: 0.9480 - regression_loss: 0.8431 - classification_loss: 0.1049 44/500 [=>............................] - ETA: 2:36 - loss: 0.9503 - regression_loss: 0.8441 - classification_loss: 0.1062 45/500 [=>............................] - ETA: 2:35 - loss: 0.9500 - regression_loss: 0.8441 - classification_loss: 0.1059 46/500 [=>............................] - ETA: 2:35 - loss: 0.9489 - regression_loss: 0.8429 - classification_loss: 0.1060 47/500 [=>............................] - ETA: 2:35 - loss: 0.9517 - regression_loss: 0.8447 - classification_loss: 0.1069 48/500 [=>............................] - ETA: 2:34 - loss: 0.9563 - regression_loss: 0.8491 - classification_loss: 0.1073 49/500 [=>............................] - ETA: 2:34 - loss: 0.9593 - regression_loss: 0.8516 - classification_loss: 0.1077 50/500 [==>...........................] - ETA: 2:34 - loss: 0.9616 - regression_loss: 0.8534 - classification_loss: 0.1081 51/500 [==>...........................] - ETA: 2:33 - loss: 0.9500 - regression_loss: 0.8433 - classification_loss: 0.1067 52/500 [==>...........................] - ETA: 2:33 - loss: 0.9562 - regression_loss: 0.8482 - classification_loss: 0.1079 53/500 [==>...........................] - ETA: 2:32 - loss: 0.9516 - regression_loss: 0.8443 - classification_loss: 0.1074 54/500 [==>...........................] - ETA: 2:32 - loss: 0.9482 - regression_loss: 0.8408 - classification_loss: 0.1074 55/500 [==>...........................] - ETA: 2:32 - loss: 0.9465 - regression_loss: 0.8398 - classification_loss: 0.1067 56/500 [==>...........................] - ETA: 2:32 - loss: 0.9455 - regression_loss: 0.8391 - classification_loss: 0.1064 57/500 [==>...........................] 
- ETA: 2:31 - loss: 0.9425 - regression_loss: 0.8363 - classification_loss: 0.1062 58/500 [==>...........................] - ETA: 2:31 - loss: 0.9400 - regression_loss: 0.8340 - classification_loss: 0.1060 59/500 [==>...........................] - ETA: 2:30 - loss: 0.9342 - regression_loss: 0.8291 - classification_loss: 0.1051 60/500 [==>...........................] - ETA: 2:30 - loss: 0.9332 - regression_loss: 0.8284 - classification_loss: 0.1048 61/500 [==>...........................] - ETA: 2:30 - loss: 0.9298 - regression_loss: 0.8256 - classification_loss: 0.1041 62/500 [==>...........................] - ETA: 2:29 - loss: 0.9354 - regression_loss: 0.8309 - classification_loss: 0.1045 63/500 [==>...........................] - ETA: 2:29 - loss: 0.9400 - regression_loss: 0.8349 - classification_loss: 0.1051 64/500 [==>...........................] - ETA: 2:29 - loss: 0.9365 - regression_loss: 0.8317 - classification_loss: 0.1048 65/500 [==>...........................] - ETA: 2:28 - loss: 0.9387 - regression_loss: 0.8340 - classification_loss: 0.1048 66/500 [==>...........................] - ETA: 2:28 - loss: 0.9414 - regression_loss: 0.8363 - classification_loss: 0.1051 67/500 [===>..........................] - ETA: 2:28 - loss: 0.9409 - regression_loss: 0.8360 - classification_loss: 0.1049 68/500 [===>..........................] - ETA: 2:27 - loss: 0.9468 - regression_loss: 0.8417 - classification_loss: 0.1052 69/500 [===>..........................] - ETA: 2:27 - loss: 0.9512 - regression_loss: 0.8453 - classification_loss: 0.1060 70/500 [===>..........................] - ETA: 2:26 - loss: 0.9516 - regression_loss: 0.8457 - classification_loss: 0.1059 71/500 [===>..........................] - ETA: 2:26 - loss: 0.9483 - regression_loss: 0.8427 - classification_loss: 0.1056 72/500 [===>..........................] - ETA: 2:26 - loss: 0.9432 - regression_loss: 0.8383 - classification_loss: 0.1049 73/500 [===>..........................] 
- ETA: 2:25 - loss: 0.9437 - regression_loss: 0.8388 - classification_loss: 0.1049 74/500 [===>..........................] - ETA: 2:25 - loss: 0.9481 - regression_loss: 0.8424 - classification_loss: 0.1058 75/500 [===>..........................] - ETA: 2:25 - loss: 0.9449 - regression_loss: 0.8399 - classification_loss: 0.1051 76/500 [===>..........................] - ETA: 2:24 - loss: 0.9395 - regression_loss: 0.8350 - classification_loss: 0.1045 77/500 [===>..........................] - ETA: 2:24 - loss: 0.9393 - regression_loss: 0.8351 - classification_loss: 0.1043 78/500 [===>..........................] - ETA: 2:24 - loss: 0.9356 - regression_loss: 0.8318 - classification_loss: 0.1037 79/500 [===>..........................] - ETA: 2:24 - loss: 0.9286 - regression_loss: 0.8257 - classification_loss: 0.1029 80/500 [===>..........................] - ETA: 2:23 - loss: 0.9280 - regression_loss: 0.8245 - classification_loss: 0.1034 81/500 [===>..........................] - ETA: 2:23 - loss: 0.9232 - regression_loss: 0.8202 - classification_loss: 0.1030 82/500 [===>..........................] - ETA: 2:22 - loss: 0.9198 - regression_loss: 0.8174 - classification_loss: 0.1025 83/500 [===>..........................] - ETA: 2:22 - loss: 0.9180 - regression_loss: 0.8157 - classification_loss: 0.1024 84/500 [====>.........................] - ETA: 2:22 - loss: 0.9119 - regression_loss: 0.8104 - classification_loss: 0.1015 85/500 [====>.........................] - ETA: 2:21 - loss: 0.9114 - regression_loss: 0.8101 - classification_loss: 0.1013 86/500 [====>.........................] - ETA: 2:21 - loss: 0.9139 - regression_loss: 0.8123 - classification_loss: 0.1016 87/500 [====>.........................] - ETA: 2:21 - loss: 0.9116 - regression_loss: 0.8100 - classification_loss: 0.1016 88/500 [====>.........................] - ETA: 2:21 - loss: 0.9108 - regression_loss: 0.8097 - classification_loss: 0.1012 89/500 [====>.........................] 
- ETA: 2:20 - loss: 0.9121 - regression_loss: 0.8111 - classification_loss: 0.1010
[... per-batch progress updates for steps 90-424 condensed; sampled checkpoints below ...]
100/500 [=====>........................] - ETA: 2:16 - loss: 0.8911 - regression_loss: 0.7908 - classification_loss: 0.1004
150/500 [========>.....................] - ETA: 1:59 - loss: 0.8803 - regression_loss: 0.7822 - classification_loss: 0.0981
200/500 [===========>..................] - ETA: 1:42 - loss: 0.8855 - regression_loss: 0.7854 - classification_loss: 0.1001
250/500 [==============>...............] - ETA: 1:25 - loss: 0.8743 - regression_loss: 0.7756 - classification_loss: 0.0987
300/500 [=================>............] - ETA: 1:08 - loss: 0.8725 - regression_loss: 0.7729 - classification_loss: 0.0996
350/500 [====================>.........] - ETA: 51s - loss: 0.8762 - regression_loss: 0.7769 - classification_loss: 0.0993
400/500 [=======================>......] - ETA: 34s - loss: 0.8693 - regression_loss: 0.7713 - classification_loss: 0.0981
424/500 [========================>.....] - ETA: 26s - loss: 0.8701 - regression_loss: 0.7723 - classification_loss: 0.0978 425/500 [========================>.....]
- ETA: 25s - loss: 0.8700 - regression_loss: 0.7723 - classification_loss: 0.0977 426/500 [========================>.....] - ETA: 25s - loss: 0.8690 - regression_loss: 0.7714 - classification_loss: 0.0976 427/500 [========================>.....] - ETA: 25s - loss: 0.8694 - regression_loss: 0.7718 - classification_loss: 0.0977 428/500 [========================>.....] - ETA: 24s - loss: 0.8680 - regression_loss: 0.7705 - classification_loss: 0.0975 429/500 [========================>.....] - ETA: 24s - loss: 0.8679 - regression_loss: 0.7704 - classification_loss: 0.0975 430/500 [========================>.....] - ETA: 24s - loss: 0.8685 - regression_loss: 0.7709 - classification_loss: 0.0976 431/500 [========================>.....] - ETA: 23s - loss: 0.8678 - regression_loss: 0.7703 - classification_loss: 0.0975 432/500 [========================>.....] - ETA: 23s - loss: 0.8675 - regression_loss: 0.7701 - classification_loss: 0.0974 433/500 [========================>.....] - ETA: 23s - loss: 0.8682 - regression_loss: 0.7707 - classification_loss: 0.0975 434/500 [=========================>....] - ETA: 22s - loss: 0.8682 - regression_loss: 0.7707 - classification_loss: 0.0975 435/500 [=========================>....] - ETA: 22s - loss: 0.8685 - regression_loss: 0.7709 - classification_loss: 0.0976 436/500 [=========================>....] - ETA: 21s - loss: 0.8682 - regression_loss: 0.7706 - classification_loss: 0.0976 437/500 [=========================>....] - ETA: 21s - loss: 0.8677 - regression_loss: 0.7702 - classification_loss: 0.0975 438/500 [=========================>....] - ETA: 21s - loss: 0.8668 - regression_loss: 0.7694 - classification_loss: 0.0974 439/500 [=========================>....] - ETA: 20s - loss: 0.8666 - regression_loss: 0.7693 - classification_loss: 0.0974 440/500 [=========================>....] - ETA: 20s - loss: 0.8682 - regression_loss: 0.7706 - classification_loss: 0.0975 441/500 [=========================>....] 
- ETA: 20s - loss: 0.8679 - regression_loss: 0.7704 - classification_loss: 0.0975 442/500 [=========================>....] - ETA: 19s - loss: 0.8681 - regression_loss: 0.7706 - classification_loss: 0.0975 443/500 [=========================>....] - ETA: 19s - loss: 0.8674 - regression_loss: 0.7700 - classification_loss: 0.0974 444/500 [=========================>....] - ETA: 19s - loss: 0.8675 - regression_loss: 0.7701 - classification_loss: 0.0973 445/500 [=========================>....] - ETA: 18s - loss: 0.8681 - regression_loss: 0.7706 - classification_loss: 0.0974 446/500 [=========================>....] - ETA: 18s - loss: 0.8679 - regression_loss: 0.7705 - classification_loss: 0.0974 447/500 [=========================>....] - ETA: 18s - loss: 0.8677 - regression_loss: 0.7703 - classification_loss: 0.0974 448/500 [=========================>....] - ETA: 17s - loss: 0.8685 - regression_loss: 0.7710 - classification_loss: 0.0975 449/500 [=========================>....] - ETA: 17s - loss: 0.8696 - regression_loss: 0.7719 - classification_loss: 0.0976 450/500 [==========================>...] - ETA: 17s - loss: 0.8693 - regression_loss: 0.7717 - classification_loss: 0.0976 451/500 [==========================>...] - ETA: 16s - loss: 0.8694 - regression_loss: 0.7718 - classification_loss: 0.0976 452/500 [==========================>...] - ETA: 16s - loss: 0.8698 - regression_loss: 0.7722 - classification_loss: 0.0976 453/500 [==========================>...] - ETA: 16s - loss: 0.8698 - regression_loss: 0.7722 - classification_loss: 0.0976 454/500 [==========================>...] - ETA: 15s - loss: 0.8697 - regression_loss: 0.7721 - classification_loss: 0.0976 455/500 [==========================>...] - ETA: 15s - loss: 0.8698 - regression_loss: 0.7723 - classification_loss: 0.0976 456/500 [==========================>...] - ETA: 15s - loss: 0.8707 - regression_loss: 0.7731 - classification_loss: 0.0976 457/500 [==========================>...] 
- ETA: 14s - loss: 0.8705 - regression_loss: 0.7729 - classification_loss: 0.0976 458/500 [==========================>...] - ETA: 14s - loss: 0.8703 - regression_loss: 0.7728 - classification_loss: 0.0976 459/500 [==========================>...] - ETA: 14s - loss: 0.8697 - regression_loss: 0.7722 - classification_loss: 0.0975 460/500 [==========================>...] - ETA: 13s - loss: 0.8690 - regression_loss: 0.7716 - classification_loss: 0.0974 461/500 [==========================>...] - ETA: 13s - loss: 0.8688 - regression_loss: 0.7715 - classification_loss: 0.0973 462/500 [==========================>...] - ETA: 13s - loss: 0.8683 - regression_loss: 0.7710 - classification_loss: 0.0973 463/500 [==========================>...] - ETA: 12s - loss: 0.8695 - regression_loss: 0.7720 - classification_loss: 0.0975 464/500 [==========================>...] - ETA: 12s - loss: 0.8695 - regression_loss: 0.7720 - classification_loss: 0.0975 465/500 [==========================>...] - ETA: 12s - loss: 0.8694 - regression_loss: 0.7720 - classification_loss: 0.0974 466/500 [==========================>...] - ETA: 11s - loss: 0.8691 - regression_loss: 0.7717 - classification_loss: 0.0973 467/500 [===========================>..] - ETA: 11s - loss: 0.8690 - regression_loss: 0.7715 - classification_loss: 0.0976 468/500 [===========================>..] - ETA: 11s - loss: 0.8705 - regression_loss: 0.7727 - classification_loss: 0.0978 469/500 [===========================>..] - ETA: 10s - loss: 0.8699 - regression_loss: 0.7722 - classification_loss: 0.0977 470/500 [===========================>..] - ETA: 10s - loss: 0.8701 - regression_loss: 0.7723 - classification_loss: 0.0978 471/500 [===========================>..] - ETA: 9s - loss: 0.8696 - regression_loss: 0.7718 - classification_loss: 0.0978  472/500 [===========================>..] - ETA: 9s - loss: 0.8693 - regression_loss: 0.7717 - classification_loss: 0.0977 473/500 [===========================>..] 
- ETA: 9s - loss: 0.8690 - regression_loss: 0.7714 - classification_loss: 0.0976 474/500 [===========================>..] - ETA: 8s - loss: 0.8692 - regression_loss: 0.7716 - classification_loss: 0.0976 475/500 [===========================>..] - ETA: 8s - loss: 0.8688 - regression_loss: 0.7714 - classification_loss: 0.0975 476/500 [===========================>..] - ETA: 8s - loss: 0.8688 - regression_loss: 0.7713 - classification_loss: 0.0974 477/500 [===========================>..] - ETA: 7s - loss: 0.8686 - regression_loss: 0.7713 - classification_loss: 0.0973 478/500 [===========================>..] - ETA: 7s - loss: 0.8686 - regression_loss: 0.7713 - classification_loss: 0.0973 479/500 [===========================>..] - ETA: 7s - loss: 0.8683 - regression_loss: 0.7711 - classification_loss: 0.0972 480/500 [===========================>..] - ETA: 6s - loss: 0.8689 - regression_loss: 0.7716 - classification_loss: 0.0973 481/500 [===========================>..] - ETA: 6s - loss: 0.8685 - regression_loss: 0.7714 - classification_loss: 0.0972 482/500 [===========================>..] - ETA: 6s - loss: 0.8675 - regression_loss: 0.7704 - classification_loss: 0.0970 483/500 [===========================>..] - ETA: 5s - loss: 0.8675 - regression_loss: 0.7705 - classification_loss: 0.0970 484/500 [============================>.] - ETA: 5s - loss: 0.8674 - regression_loss: 0.7704 - classification_loss: 0.0970 485/500 [============================>.] - ETA: 5s - loss: 0.8675 - regression_loss: 0.7705 - classification_loss: 0.0970 486/500 [============================>.] - ETA: 4s - loss: 0.8677 - regression_loss: 0.7706 - classification_loss: 0.0970 487/500 [============================>.] - ETA: 4s - loss: 0.8687 - regression_loss: 0.7717 - classification_loss: 0.0970 488/500 [============================>.] - ETA: 4s - loss: 0.8691 - regression_loss: 0.7720 - classification_loss: 0.0971 489/500 [============================>.] 
- ETA: 3s - loss: 0.8685 - regression_loss: 0.7715 - classification_loss: 0.0970 490/500 [============================>.] - ETA: 3s - loss: 0.8698 - regression_loss: 0.7726 - classification_loss: 0.0972 491/500 [============================>.] - ETA: 3s - loss: 0.8704 - regression_loss: 0.7731 - classification_loss: 0.0974 492/500 [============================>.] - ETA: 2s - loss: 0.8697 - regression_loss: 0.7724 - classification_loss: 0.0972 493/500 [============================>.] - ETA: 2s - loss: 0.8700 - regression_loss: 0.7727 - classification_loss: 0.0973 494/500 [============================>.] - ETA: 2s - loss: 0.8707 - regression_loss: 0.7733 - classification_loss: 0.0974 495/500 [============================>.] - ETA: 1s - loss: 0.8704 - regression_loss: 0.7731 - classification_loss: 0.0973 496/500 [============================>.] - ETA: 1s - loss: 0.8697 - regression_loss: 0.7725 - classification_loss: 0.0972 497/500 [============================>.] - ETA: 1s - loss: 0.8699 - regression_loss: 0.7726 - classification_loss: 0.0973 498/500 [============================>.] - ETA: 0s - loss: 0.8693 - regression_loss: 0.7721 - classification_loss: 0.0972 499/500 [============================>.] - ETA: 0s - loss: 0.8688 - regression_loss: 0.7717 - classification_loss: 0.0972 500/500 [==============================] - 172s 344ms/step - loss: 0.8684 - regression_loss: 0.7713 - classification_loss: 0.0971 1172 instances of class plum with average precision: 0.6988 mAP: 0.6988 Epoch 00011: saving model to ./training/snapshots/resnet101_pascal_11.h5 Epoch 12/150 1/500 [..............................] - ETA: 2:46 - loss: 0.9791 - regression_loss: 0.9003 - classification_loss: 0.0788 2/500 [..............................] - ETA: 2:44 - loss: 0.8565 - regression_loss: 0.7979 - classification_loss: 0.0587 3/500 [..............................] - ETA: 2:48 - loss: 0.9297 - regression_loss: 0.8560 - classification_loss: 0.0737 4/500 [..............................] 
[per-step progress output for steps 5–196 of epoch 12 omitted; loss settled from ~1.10 at the start of the epoch into the 0.80–0.83 range]
- ETA: 1:44 - loss: 0.8081 - regression_loss: 0.7188 - classification_loss: 0.0893 197/500 [==========>...................] - ETA: 1:44 - loss: 0.8097 - regression_loss: 0.7203 - classification_loss: 0.0894 198/500 [==========>...................] - ETA: 1:43 - loss: 0.8079 - regression_loss: 0.7187 - classification_loss: 0.0893 199/500 [==========>...................] - ETA: 1:43 - loss: 0.8067 - regression_loss: 0.7176 - classification_loss: 0.0891 200/500 [===========>..................] - ETA: 1:43 - loss: 0.8082 - regression_loss: 0.7190 - classification_loss: 0.0892 201/500 [===========>..................] - ETA: 1:42 - loss: 0.8070 - regression_loss: 0.7180 - classification_loss: 0.0890 202/500 [===========>..................] - ETA: 1:42 - loss: 0.8056 - regression_loss: 0.7168 - classification_loss: 0.0888 203/500 [===========>..................] - ETA: 1:42 - loss: 0.8065 - regression_loss: 0.7177 - classification_loss: 0.0888 204/500 [===========>..................] - ETA: 1:41 - loss: 0.8054 - regression_loss: 0.7168 - classification_loss: 0.0887 205/500 [===========>..................] - ETA: 1:41 - loss: 0.8053 - regression_loss: 0.7167 - classification_loss: 0.0887 206/500 [===========>..................] - ETA: 1:41 - loss: 0.8054 - regression_loss: 0.7166 - classification_loss: 0.0887 207/500 [===========>..................] - ETA: 1:40 - loss: 0.8059 - regression_loss: 0.7171 - classification_loss: 0.0888 208/500 [===========>..................] - ETA: 1:40 - loss: 0.8058 - regression_loss: 0.7171 - classification_loss: 0.0887 209/500 [===========>..................] - ETA: 1:39 - loss: 0.8038 - regression_loss: 0.7152 - classification_loss: 0.0886 210/500 [===========>..................] - ETA: 1:39 - loss: 0.8051 - regression_loss: 0.7164 - classification_loss: 0.0887 211/500 [===========>..................] - ETA: 1:39 - loss: 0.8057 - regression_loss: 0.7169 - classification_loss: 0.0888 212/500 [===========>..................] 
- ETA: 1:38 - loss: 0.8044 - regression_loss: 0.7159 - classification_loss: 0.0886 213/500 [===========>..................] - ETA: 1:38 - loss: 0.8071 - regression_loss: 0.7181 - classification_loss: 0.0891 214/500 [===========>..................] - ETA: 1:38 - loss: 0.8083 - regression_loss: 0.7187 - classification_loss: 0.0896 215/500 [===========>..................] - ETA: 1:37 - loss: 0.8078 - regression_loss: 0.7182 - classification_loss: 0.0895 216/500 [===========>..................] - ETA: 1:37 - loss: 0.8084 - regression_loss: 0.7188 - classification_loss: 0.0896 217/500 [============>.................] - ETA: 1:37 - loss: 0.8084 - regression_loss: 0.7188 - classification_loss: 0.0896 218/500 [============>.................] - ETA: 1:36 - loss: 0.8080 - regression_loss: 0.7185 - classification_loss: 0.0895 219/500 [============>.................] - ETA: 1:36 - loss: 0.8073 - regression_loss: 0.7177 - classification_loss: 0.0896 220/500 [============>.................] - ETA: 1:36 - loss: 0.8071 - regression_loss: 0.7177 - classification_loss: 0.0895 221/500 [============>.................] - ETA: 1:35 - loss: 0.8064 - regression_loss: 0.7171 - classification_loss: 0.0893 222/500 [============>.................] - ETA: 1:35 - loss: 0.8071 - regression_loss: 0.7178 - classification_loss: 0.0893 223/500 [============>.................] - ETA: 1:35 - loss: 0.8060 - regression_loss: 0.7168 - classification_loss: 0.0892 224/500 [============>.................] - ETA: 1:34 - loss: 0.8048 - regression_loss: 0.7157 - classification_loss: 0.0892 225/500 [============>.................] - ETA: 1:34 - loss: 0.8035 - regression_loss: 0.7145 - classification_loss: 0.0890 226/500 [============>.................] - ETA: 1:34 - loss: 0.8032 - regression_loss: 0.7143 - classification_loss: 0.0889 227/500 [============>.................] - ETA: 1:33 - loss: 0.8036 - regression_loss: 0.7146 - classification_loss: 0.0890 228/500 [============>.................] 
- ETA: 1:33 - loss: 0.8040 - regression_loss: 0.7149 - classification_loss: 0.0891 229/500 [============>.................] - ETA: 1:33 - loss: 0.8032 - regression_loss: 0.7141 - classification_loss: 0.0891 230/500 [============>.................] - ETA: 1:32 - loss: 0.8041 - regression_loss: 0.7151 - classification_loss: 0.0890 231/500 [============>.................] - ETA: 1:32 - loss: 0.8049 - regression_loss: 0.7159 - classification_loss: 0.0890 232/500 [============>.................] - ETA: 1:32 - loss: 0.8038 - regression_loss: 0.7149 - classification_loss: 0.0889 233/500 [============>.................] - ETA: 1:31 - loss: 0.8048 - regression_loss: 0.7159 - classification_loss: 0.0890 234/500 [=============>................] - ETA: 1:31 - loss: 0.8047 - regression_loss: 0.7159 - classification_loss: 0.0888 235/500 [=============>................] - ETA: 1:31 - loss: 0.8065 - regression_loss: 0.7176 - classification_loss: 0.0890 236/500 [=============>................] - ETA: 1:30 - loss: 0.8080 - regression_loss: 0.7189 - classification_loss: 0.0891 237/500 [=============>................] - ETA: 1:30 - loss: 0.8091 - regression_loss: 0.7198 - classification_loss: 0.0894 238/500 [=============>................] - ETA: 1:30 - loss: 0.8079 - regression_loss: 0.7187 - classification_loss: 0.0892 239/500 [=============>................] - ETA: 1:29 - loss: 0.8064 - regression_loss: 0.7173 - classification_loss: 0.0891 240/500 [=============>................] - ETA: 1:29 - loss: 0.8081 - regression_loss: 0.7189 - classification_loss: 0.0891 241/500 [=============>................] - ETA: 1:29 - loss: 0.8081 - regression_loss: 0.7191 - classification_loss: 0.0891 242/500 [=============>................] - ETA: 1:28 - loss: 0.8090 - regression_loss: 0.7200 - classification_loss: 0.0890 243/500 [=============>................] - ETA: 1:28 - loss: 0.8094 - regression_loss: 0.7204 - classification_loss: 0.0891 244/500 [=============>................] 
- ETA: 1:28 - loss: 0.8106 - regression_loss: 0.7214 - classification_loss: 0.0892 245/500 [=============>................] - ETA: 1:27 - loss: 0.8104 - regression_loss: 0.7212 - classification_loss: 0.0892 246/500 [=============>................] - ETA: 1:27 - loss: 0.8119 - regression_loss: 0.7226 - classification_loss: 0.0893 247/500 [=============>................] - ETA: 1:27 - loss: 0.8125 - regression_loss: 0.7232 - classification_loss: 0.0893 248/500 [=============>................] - ETA: 1:26 - loss: 0.8116 - regression_loss: 0.7225 - classification_loss: 0.0891 249/500 [=============>................] - ETA: 1:26 - loss: 0.8117 - regression_loss: 0.7226 - classification_loss: 0.0891 250/500 [==============>...............] - ETA: 1:26 - loss: 0.8097 - regression_loss: 0.7208 - classification_loss: 0.0889 251/500 [==============>...............] - ETA: 1:25 - loss: 0.8095 - regression_loss: 0.7207 - classification_loss: 0.0888 252/500 [==============>...............] - ETA: 1:25 - loss: 0.8104 - regression_loss: 0.7217 - classification_loss: 0.0887 253/500 [==============>...............] - ETA: 1:24 - loss: 0.8107 - regression_loss: 0.7221 - classification_loss: 0.0887 254/500 [==============>...............] - ETA: 1:24 - loss: 0.8123 - regression_loss: 0.7234 - classification_loss: 0.0889 255/500 [==============>...............] - ETA: 1:24 - loss: 0.8125 - regression_loss: 0.7235 - classification_loss: 0.0889 256/500 [==============>...............] - ETA: 1:23 - loss: 0.8112 - regression_loss: 0.7224 - classification_loss: 0.0888 257/500 [==============>...............] - ETA: 1:23 - loss: 0.8117 - regression_loss: 0.7230 - classification_loss: 0.0887 258/500 [==============>...............] - ETA: 1:23 - loss: 0.8141 - regression_loss: 0.7251 - classification_loss: 0.0890 259/500 [==============>...............] - ETA: 1:22 - loss: 0.8147 - regression_loss: 0.7256 - classification_loss: 0.0891 260/500 [==============>...............] 
- ETA: 1:22 - loss: 0.8145 - regression_loss: 0.7254 - classification_loss: 0.0892 261/500 [==============>...............] - ETA: 1:22 - loss: 0.8144 - regression_loss: 0.7254 - classification_loss: 0.0891 262/500 [==============>...............] - ETA: 1:21 - loss: 0.8163 - regression_loss: 0.7271 - classification_loss: 0.0892 263/500 [==============>...............] - ETA: 1:21 - loss: 0.8147 - regression_loss: 0.7258 - classification_loss: 0.0889 264/500 [==============>...............] - ETA: 1:21 - loss: 0.8151 - regression_loss: 0.7261 - classification_loss: 0.0890 265/500 [==============>...............] - ETA: 1:20 - loss: 0.8161 - regression_loss: 0.7271 - classification_loss: 0.0890 266/500 [==============>...............] - ETA: 1:20 - loss: 0.8191 - regression_loss: 0.7296 - classification_loss: 0.0895 267/500 [===============>..............] - ETA: 1:20 - loss: 0.8190 - regression_loss: 0.7295 - classification_loss: 0.0895 268/500 [===============>..............] - ETA: 1:19 - loss: 0.8197 - regression_loss: 0.7302 - classification_loss: 0.0895 269/500 [===============>..............] - ETA: 1:19 - loss: 0.8198 - regression_loss: 0.7303 - classification_loss: 0.0895 270/500 [===============>..............] - ETA: 1:19 - loss: 0.8179 - regression_loss: 0.7286 - classification_loss: 0.0893 271/500 [===============>..............] - ETA: 1:18 - loss: 0.8191 - regression_loss: 0.7298 - classification_loss: 0.0893 272/500 [===============>..............] - ETA: 1:18 - loss: 0.8200 - regression_loss: 0.7307 - classification_loss: 0.0893 273/500 [===============>..............] - ETA: 1:18 - loss: 0.8198 - regression_loss: 0.7307 - classification_loss: 0.0891 274/500 [===============>..............] - ETA: 1:17 - loss: 0.8187 - regression_loss: 0.7297 - classification_loss: 0.0890 275/500 [===============>..............] - ETA: 1:17 - loss: 0.8179 - regression_loss: 0.7292 - classification_loss: 0.0888 276/500 [===============>..............] 
- ETA: 1:17 - loss: 0.8165 - regression_loss: 0.7279 - classification_loss: 0.0886 277/500 [===============>..............] - ETA: 1:16 - loss: 0.8165 - regression_loss: 0.7280 - classification_loss: 0.0886 278/500 [===============>..............] - ETA: 1:16 - loss: 0.8152 - regression_loss: 0.7268 - classification_loss: 0.0884 279/500 [===============>..............] - ETA: 1:15 - loss: 0.8166 - regression_loss: 0.7278 - classification_loss: 0.0888 280/500 [===============>..............] - ETA: 1:15 - loss: 0.8151 - regression_loss: 0.7265 - classification_loss: 0.0886 281/500 [===============>..............] - ETA: 1:15 - loss: 0.8151 - regression_loss: 0.7265 - classification_loss: 0.0886 282/500 [===============>..............] - ETA: 1:14 - loss: 0.8163 - regression_loss: 0.7276 - classification_loss: 0.0886 283/500 [===============>..............] - ETA: 1:14 - loss: 0.8157 - regression_loss: 0.7272 - classification_loss: 0.0886 284/500 [================>.............] - ETA: 1:14 - loss: 0.8159 - regression_loss: 0.7274 - classification_loss: 0.0886 285/500 [================>.............] - ETA: 1:13 - loss: 0.8150 - regression_loss: 0.7265 - classification_loss: 0.0885 286/500 [================>.............] - ETA: 1:13 - loss: 0.8147 - regression_loss: 0.7261 - classification_loss: 0.0886 287/500 [================>.............] - ETA: 1:13 - loss: 0.8145 - regression_loss: 0.7258 - classification_loss: 0.0886 288/500 [================>.............] - ETA: 1:12 - loss: 0.8152 - regression_loss: 0.7264 - classification_loss: 0.0888 289/500 [================>.............] - ETA: 1:12 - loss: 0.8155 - regression_loss: 0.7268 - classification_loss: 0.0887 290/500 [================>.............] - ETA: 1:12 - loss: 0.8148 - regression_loss: 0.7262 - classification_loss: 0.0886 291/500 [================>.............] - ETA: 1:11 - loss: 0.8148 - regression_loss: 0.7263 - classification_loss: 0.0885 292/500 [================>.............] 
- ETA: 1:11 - loss: 0.8149 - regression_loss: 0.7264 - classification_loss: 0.0886 293/500 [================>.............] - ETA: 1:11 - loss: 0.8138 - regression_loss: 0.7254 - classification_loss: 0.0884 294/500 [================>.............] - ETA: 1:10 - loss: 0.8150 - regression_loss: 0.7264 - classification_loss: 0.0886 295/500 [================>.............] - ETA: 1:10 - loss: 0.8135 - regression_loss: 0.7251 - classification_loss: 0.0884 296/500 [================>.............] - ETA: 1:10 - loss: 0.8135 - regression_loss: 0.7251 - classification_loss: 0.0884 297/500 [================>.............] - ETA: 1:09 - loss: 0.8139 - regression_loss: 0.7255 - classification_loss: 0.0884 298/500 [================>.............] - ETA: 1:09 - loss: 0.8148 - regression_loss: 0.7263 - classification_loss: 0.0885 299/500 [================>.............] - ETA: 1:09 - loss: 0.8142 - regression_loss: 0.7257 - classification_loss: 0.0884 300/500 [=================>............] - ETA: 1:08 - loss: 0.8160 - regression_loss: 0.7272 - classification_loss: 0.0888 301/500 [=================>............] - ETA: 1:08 - loss: 0.8165 - regression_loss: 0.7276 - classification_loss: 0.0889 302/500 [=================>............] - ETA: 1:08 - loss: 0.8158 - regression_loss: 0.7270 - classification_loss: 0.0888 303/500 [=================>............] - ETA: 1:07 - loss: 0.8156 - regression_loss: 0.7269 - classification_loss: 0.0886 304/500 [=================>............] - ETA: 1:07 - loss: 0.8160 - regression_loss: 0.7272 - classification_loss: 0.0887 305/500 [=================>............] - ETA: 1:07 - loss: 0.8158 - regression_loss: 0.7272 - classification_loss: 0.0886 306/500 [=================>............] - ETA: 1:06 - loss: 0.8159 - regression_loss: 0.7273 - classification_loss: 0.0886 307/500 [=================>............] - ETA: 1:06 - loss: 0.8167 - regression_loss: 0.7279 - classification_loss: 0.0887 308/500 [=================>............] 
- ETA: 1:06 - loss: 0.8170 - regression_loss: 0.7282 - classification_loss: 0.0888 309/500 [=================>............] - ETA: 1:05 - loss: 0.8165 - regression_loss: 0.7278 - classification_loss: 0.0888 310/500 [=================>............] - ETA: 1:05 - loss: 0.8173 - regression_loss: 0.7285 - classification_loss: 0.0888 311/500 [=================>............] - ETA: 1:04 - loss: 0.8167 - regression_loss: 0.7279 - classification_loss: 0.0888 312/500 [=================>............] - ETA: 1:04 - loss: 0.8158 - regression_loss: 0.7270 - classification_loss: 0.0888 313/500 [=================>............] - ETA: 1:04 - loss: 0.8154 - regression_loss: 0.7266 - classification_loss: 0.0888 314/500 [=================>............] - ETA: 1:03 - loss: 0.8156 - regression_loss: 0.7268 - classification_loss: 0.0888 315/500 [=================>............] - ETA: 1:03 - loss: 0.8163 - regression_loss: 0.7274 - classification_loss: 0.0889 316/500 [=================>............] - ETA: 1:03 - loss: 0.8170 - regression_loss: 0.7280 - classification_loss: 0.0890 317/500 [==================>...........] - ETA: 1:02 - loss: 0.8171 - regression_loss: 0.7281 - classification_loss: 0.0889 318/500 [==================>...........] - ETA: 1:02 - loss: 0.8167 - regression_loss: 0.7278 - classification_loss: 0.0889 319/500 [==================>...........] - ETA: 1:02 - loss: 0.8167 - regression_loss: 0.7277 - classification_loss: 0.0890 320/500 [==================>...........] - ETA: 1:01 - loss: 0.8165 - regression_loss: 0.7276 - classification_loss: 0.0889 321/500 [==================>...........] - ETA: 1:01 - loss: 0.8169 - regression_loss: 0.7280 - classification_loss: 0.0890 322/500 [==================>...........] - ETA: 1:01 - loss: 0.8168 - regression_loss: 0.7279 - classification_loss: 0.0890 323/500 [==================>...........] - ETA: 1:00 - loss: 0.8174 - regression_loss: 0.7284 - classification_loss: 0.0890 324/500 [==================>...........] 
- ETA: 1:00 - loss: 0.8177 - regression_loss: 0.7286 - classification_loss: 0.0891 325/500 [==================>...........] - ETA: 1:00 - loss: 0.8177 - regression_loss: 0.7286 - classification_loss: 0.0891 326/500 [==================>...........] - ETA: 59s - loss: 0.8174 - regression_loss: 0.7284 - classification_loss: 0.0890  327/500 [==================>...........] - ETA: 59s - loss: 0.8173 - regression_loss: 0.7283 - classification_loss: 0.0890 328/500 [==================>...........] - ETA: 59s - loss: 0.8172 - regression_loss: 0.7283 - classification_loss: 0.0889 329/500 [==================>...........] - ETA: 58s - loss: 0.8180 - regression_loss: 0.7291 - classification_loss: 0.0889 330/500 [==================>...........] - ETA: 58s - loss: 0.8171 - regression_loss: 0.7284 - classification_loss: 0.0888 331/500 [==================>...........] - ETA: 58s - loss: 0.8167 - regression_loss: 0.7280 - classification_loss: 0.0887 332/500 [==================>...........] - ETA: 57s - loss: 0.8168 - regression_loss: 0.7282 - classification_loss: 0.0887 333/500 [==================>...........] - ETA: 57s - loss: 0.8164 - regression_loss: 0.7277 - classification_loss: 0.0887 334/500 [===================>..........] - ETA: 57s - loss: 0.8180 - regression_loss: 0.7292 - classification_loss: 0.0889 335/500 [===================>..........] - ETA: 56s - loss: 0.8169 - regression_loss: 0.7282 - classification_loss: 0.0887 336/500 [===================>..........] - ETA: 56s - loss: 0.8161 - regression_loss: 0.7274 - classification_loss: 0.0886 337/500 [===================>..........] - ETA: 56s - loss: 0.8157 - regression_loss: 0.7270 - classification_loss: 0.0887 338/500 [===================>..........] - ETA: 55s - loss: 0.8151 - regression_loss: 0.7265 - classification_loss: 0.0886 339/500 [===================>..........] - ETA: 55s - loss: 0.8137 - regression_loss: 0.7253 - classification_loss: 0.0885 340/500 [===================>..........] 
- ETA: 55s - loss: 0.8136 - regression_loss: 0.7251 - classification_loss: 0.0885 341/500 [===================>..........] - ETA: 54s - loss: 0.8127 - regression_loss: 0.7243 - classification_loss: 0.0884 342/500 [===================>..........] - ETA: 54s - loss: 0.8122 - regression_loss: 0.7238 - classification_loss: 0.0883 343/500 [===================>..........] - ETA: 53s - loss: 0.8114 - regression_loss: 0.7231 - classification_loss: 0.0883 344/500 [===================>..........] - ETA: 53s - loss: 0.8112 - regression_loss: 0.7230 - classification_loss: 0.0882 345/500 [===================>..........] - ETA: 53s - loss: 0.8119 - regression_loss: 0.7236 - classification_loss: 0.0883 346/500 [===================>..........] - ETA: 52s - loss: 0.8119 - regression_loss: 0.7236 - classification_loss: 0.0883 347/500 [===================>..........] - ETA: 52s - loss: 0.8124 - regression_loss: 0.7242 - classification_loss: 0.0882 348/500 [===================>..........] - ETA: 52s - loss: 0.8128 - regression_loss: 0.7245 - classification_loss: 0.0882 349/500 [===================>..........] - ETA: 51s - loss: 0.8131 - regression_loss: 0.7248 - classification_loss: 0.0883 350/500 [====================>.........] - ETA: 51s - loss: 0.8120 - regression_loss: 0.7238 - classification_loss: 0.0882 351/500 [====================>.........] - ETA: 51s - loss: 0.8120 - regression_loss: 0.7239 - classification_loss: 0.0881 352/500 [====================>.........] - ETA: 50s - loss: 0.8117 - regression_loss: 0.7236 - classification_loss: 0.0881 353/500 [====================>.........] - ETA: 50s - loss: 0.8107 - regression_loss: 0.7228 - classification_loss: 0.0880 354/500 [====================>.........] - ETA: 50s - loss: 0.8106 - regression_loss: 0.7225 - classification_loss: 0.0880 355/500 [====================>.........] - ETA: 49s - loss: 0.8114 - regression_loss: 0.7234 - classification_loss: 0.0880 356/500 [====================>.........] 
- ETA: 49s - loss: 0.8118 - regression_loss: 0.7236 - classification_loss: 0.0882 357/500 [====================>.........] - ETA: 49s - loss: 0.8120 - regression_loss: 0.7238 - classification_loss: 0.0883 358/500 [====================>.........] - ETA: 48s - loss: 0.8109 - regression_loss: 0.7228 - classification_loss: 0.0881 359/500 [====================>.........] - ETA: 48s - loss: 0.8123 - regression_loss: 0.7240 - classification_loss: 0.0883 360/500 [====================>.........] - ETA: 48s - loss: 0.8127 - regression_loss: 0.7243 - classification_loss: 0.0883 361/500 [====================>.........] - ETA: 47s - loss: 0.8127 - regression_loss: 0.7244 - classification_loss: 0.0883 362/500 [====================>.........] - ETA: 47s - loss: 0.8132 - regression_loss: 0.7249 - classification_loss: 0.0884 363/500 [====================>.........] - ETA: 47s - loss: 0.8133 - regression_loss: 0.7249 - classification_loss: 0.0884 364/500 [====================>.........] - ETA: 46s - loss: 0.8125 - regression_loss: 0.7242 - classification_loss: 0.0883 365/500 [====================>.........] - ETA: 46s - loss: 0.8119 - regression_loss: 0.7236 - classification_loss: 0.0883 366/500 [====================>.........] - ETA: 46s - loss: 0.8123 - regression_loss: 0.7240 - classification_loss: 0.0883 367/500 [=====================>........] - ETA: 45s - loss: 0.8124 - regression_loss: 0.7242 - classification_loss: 0.0883 368/500 [=====================>........] - ETA: 45s - loss: 0.8116 - regression_loss: 0.7234 - classification_loss: 0.0881 369/500 [=====================>........] - ETA: 45s - loss: 0.8114 - regression_loss: 0.7233 - classification_loss: 0.0881 370/500 [=====================>........] - ETA: 44s - loss: 0.8115 - regression_loss: 0.7234 - classification_loss: 0.0881 371/500 [=====================>........] - ETA: 44s - loss: 0.8126 - regression_loss: 0.7243 - classification_loss: 0.0882 372/500 [=====================>........] 
- ETA: 44s - loss: 0.8115 - regression_loss: 0.7235 - classification_loss: 0.0881 373/500 [=====================>........] - ETA: 43s - loss: 0.8103 - regression_loss: 0.7224 - classification_loss: 0.0880 374/500 [=====================>........] - ETA: 43s - loss: 0.8101 - regression_loss: 0.7223 - classification_loss: 0.0879 375/500 [=====================>........] - ETA: 43s - loss: 0.8096 - regression_loss: 0.7218 - classification_loss: 0.0878 376/500 [=====================>........] - ETA: 42s - loss: 0.8102 - regression_loss: 0.7223 - classification_loss: 0.0879 377/500 [=====================>........] - ETA: 42s - loss: 0.8094 - regression_loss: 0.7216 - classification_loss: 0.0878 378/500 [=====================>........] - ETA: 41s - loss: 0.8093 - regression_loss: 0.7215 - classification_loss: 0.0878 379/500 [=====================>........] - ETA: 41s - loss: 0.8095 - regression_loss: 0.7217 - classification_loss: 0.0878 380/500 [=====================>........] - ETA: 41s - loss: 0.8101 - regression_loss: 0.7222 - classification_loss: 0.0879 381/500 [=====================>........] - ETA: 40s - loss: 0.8102 - regression_loss: 0.7224 - classification_loss: 0.0879 382/500 [=====================>........] - ETA: 40s - loss: 0.8097 - regression_loss: 0.7219 - classification_loss: 0.0878 383/500 [=====================>........] - ETA: 40s - loss: 0.8092 - regression_loss: 0.7215 - classification_loss: 0.0877 384/500 [======================>.......] - ETA: 39s - loss: 0.8090 - regression_loss: 0.7215 - classification_loss: 0.0876 385/500 [======================>.......] - ETA: 39s - loss: 0.8095 - regression_loss: 0.7218 - classification_loss: 0.0877 386/500 [======================>.......] - ETA: 39s - loss: 0.8096 - regression_loss: 0.7219 - classification_loss: 0.0877 387/500 [======================>.......] - ETA: 38s - loss: 0.8090 - regression_loss: 0.7214 - classification_loss: 0.0876 388/500 [======================>.......] 
- ETA: 38s - loss: 0.8087 - regression_loss: 0.7211 - classification_loss: 0.0876 389/500 [======================>.......] - ETA: 38s - loss: 0.8093 - regression_loss: 0.7216 - classification_loss: 0.0876 390/500 [======================>.......] - ETA: 37s - loss: 0.8092 - regression_loss: 0.7213 - classification_loss: 0.0878 391/500 [======================>.......] - ETA: 37s - loss: 0.8082 - regression_loss: 0.7204 - classification_loss: 0.0877 392/500 [======================>.......] - ETA: 37s - loss: 0.8083 - regression_loss: 0.7206 - classification_loss: 0.0877 393/500 [======================>.......] - ETA: 36s - loss: 0.8088 - regression_loss: 0.7211 - classification_loss: 0.0877 394/500 [======================>.......] - ETA: 36s - loss: 0.8086 - regression_loss: 0.7209 - classification_loss: 0.0877 395/500 [======================>.......] - ETA: 36s - loss: 0.8108 - regression_loss: 0.7227 - classification_loss: 0.0881 396/500 [======================>.......] - ETA: 35s - loss: 0.8111 - regression_loss: 0.7229 - classification_loss: 0.0882 397/500 [======================>.......] - ETA: 35s - loss: 0.8106 - regression_loss: 0.7225 - classification_loss: 0.0882 398/500 [======================>.......] - ETA: 35s - loss: 0.8101 - regression_loss: 0.7219 - classification_loss: 0.0883 399/500 [======================>.......] - ETA: 34s - loss: 0.8108 - regression_loss: 0.7224 - classification_loss: 0.0885 400/500 [=======================>......] - ETA: 34s - loss: 0.8119 - regression_loss: 0.7235 - classification_loss: 0.0884 401/500 [=======================>......] - ETA: 34s - loss: 0.8116 - regression_loss: 0.7232 - classification_loss: 0.0884 402/500 [=======================>......] - ETA: 33s - loss: 0.8118 - regression_loss: 0.7234 - classification_loss: 0.0885 403/500 [=======================>......] - ETA: 33s - loss: 0.8124 - regression_loss: 0.7240 - classification_loss: 0.0884 404/500 [=======================>......] 
- ETA: 33s - loss: 0.8128 - regression_loss: 0.7242 - classification_loss: 0.0885
[...per-step progress-bar frames 405-499 trimmed (loss hovering around 0.81-0.82)...]
500/500 [==============================] - 172s 344ms/step - loss: 0.8113 - regression_loss: 0.7233 - classification_loss: 0.0879
1172 instances of class plum with average precision: 0.7101
mAP: 0.7101
Epoch 00012: saving model to ./training/snapshots/resnet101_pascal_12.h5
Epoch 13/150
1/500 [..............................] - ETA: 2:45 - loss: 0.9375 - regression_loss: 0.8478 - classification_loss: 0.0896
[...per-step progress-bar frames 2-13 trimmed...]
14/500 [..............................]
- ETA: 2:46 - loss: 0.6852 - regression_loss: 0.6116 - classification_loss: 0.0735
[...per-step progress-bar frames 15-237 trimmed (loss settling around 0.77-0.78)...]
238/500 [=============>................]
- ETA: 1:29 - loss: 0.7756 - regression_loss: 0.6902 - classification_loss: 0.0853 239/500 [=============>................] - ETA: 1:29 - loss: 0.7755 - regression_loss: 0.6901 - classification_loss: 0.0854 240/500 [=============>................] - ETA: 1:28 - loss: 0.7739 - regression_loss: 0.6887 - classification_loss: 0.0852 241/500 [=============>................] - ETA: 1:28 - loss: 0.7747 - regression_loss: 0.6893 - classification_loss: 0.0853 242/500 [=============>................] - ETA: 1:28 - loss: 0.7741 - regression_loss: 0.6888 - classification_loss: 0.0853 243/500 [=============>................] - ETA: 1:27 - loss: 0.7740 - regression_loss: 0.6888 - classification_loss: 0.0852 244/500 [=============>................] - ETA: 1:27 - loss: 0.7731 - regression_loss: 0.6879 - classification_loss: 0.0852 245/500 [=============>................] - ETA: 1:27 - loss: 0.7734 - regression_loss: 0.6882 - classification_loss: 0.0851 246/500 [=============>................] - ETA: 1:26 - loss: 0.7730 - regression_loss: 0.6879 - classification_loss: 0.0851 247/500 [=============>................] - ETA: 1:26 - loss: 0.7721 - regression_loss: 0.6871 - classification_loss: 0.0850 248/500 [=============>................] - ETA: 1:26 - loss: 0.7711 - regression_loss: 0.6862 - classification_loss: 0.0849 249/500 [=============>................] - ETA: 1:25 - loss: 0.7710 - regression_loss: 0.6860 - classification_loss: 0.0849 250/500 [==============>...............] - ETA: 1:25 - loss: 0.7699 - regression_loss: 0.6851 - classification_loss: 0.0848 251/500 [==============>...............] - ETA: 1:25 - loss: 0.7711 - regression_loss: 0.6864 - classification_loss: 0.0848 252/500 [==============>...............] - ETA: 1:24 - loss: 0.7715 - regression_loss: 0.6867 - classification_loss: 0.0848 253/500 [==============>...............] - ETA: 1:24 - loss: 0.7709 - regression_loss: 0.6863 - classification_loss: 0.0846 254/500 [==============>...............] 
- ETA: 1:24 - loss: 0.7692 - regression_loss: 0.6847 - classification_loss: 0.0844 255/500 [==============>...............] - ETA: 1:23 - loss: 0.7690 - regression_loss: 0.6847 - classification_loss: 0.0844 256/500 [==============>...............] - ETA: 1:23 - loss: 0.7705 - regression_loss: 0.6858 - classification_loss: 0.0846 257/500 [==============>...............] - ETA: 1:23 - loss: 0.7688 - regression_loss: 0.6844 - classification_loss: 0.0844 258/500 [==============>...............] - ETA: 1:22 - loss: 0.7676 - regression_loss: 0.6832 - classification_loss: 0.0844 259/500 [==============>...............] - ETA: 1:22 - loss: 0.7666 - regression_loss: 0.6823 - classification_loss: 0.0843 260/500 [==============>...............] - ETA: 1:22 - loss: 0.7663 - regression_loss: 0.6820 - classification_loss: 0.0843 261/500 [==============>...............] - ETA: 1:21 - loss: 0.7646 - regression_loss: 0.6806 - classification_loss: 0.0841 262/500 [==============>...............] - ETA: 1:21 - loss: 0.7645 - regression_loss: 0.6806 - classification_loss: 0.0839 263/500 [==============>...............] - ETA: 1:21 - loss: 0.7631 - regression_loss: 0.6794 - classification_loss: 0.0837 264/500 [==============>...............] - ETA: 1:20 - loss: 0.7649 - regression_loss: 0.6810 - classification_loss: 0.0839 265/500 [==============>...............] - ETA: 1:20 - loss: 0.7645 - regression_loss: 0.6807 - classification_loss: 0.0838 266/500 [==============>...............] - ETA: 1:20 - loss: 0.7653 - regression_loss: 0.6814 - classification_loss: 0.0839 267/500 [===============>..............] - ETA: 1:19 - loss: 0.7675 - regression_loss: 0.6834 - classification_loss: 0.0841 268/500 [===============>..............] - ETA: 1:19 - loss: 0.7678 - regression_loss: 0.6837 - classification_loss: 0.0841 269/500 [===============>..............] - ETA: 1:19 - loss: 0.7671 - regression_loss: 0.6833 - classification_loss: 0.0839 270/500 [===============>..............] 
- ETA: 1:18 - loss: 0.7674 - regression_loss: 0.6837 - classification_loss: 0.0838 271/500 [===============>..............] - ETA: 1:18 - loss: 0.7664 - regression_loss: 0.6827 - classification_loss: 0.0838 272/500 [===============>..............] - ETA: 1:17 - loss: 0.7679 - regression_loss: 0.6839 - classification_loss: 0.0840 273/500 [===============>..............] - ETA: 1:17 - loss: 0.7680 - regression_loss: 0.6839 - classification_loss: 0.0841 274/500 [===============>..............] - ETA: 1:17 - loss: 0.7683 - regression_loss: 0.6841 - classification_loss: 0.0842 275/500 [===============>..............] - ETA: 1:16 - loss: 0.7685 - regression_loss: 0.6842 - classification_loss: 0.0842 276/500 [===============>..............] - ETA: 1:16 - loss: 0.7681 - regression_loss: 0.6837 - classification_loss: 0.0843 277/500 [===============>..............] - ETA: 1:16 - loss: 0.7685 - regression_loss: 0.6841 - classification_loss: 0.0844 278/500 [===============>..............] - ETA: 1:15 - loss: 0.7700 - regression_loss: 0.6855 - classification_loss: 0.0845 279/500 [===============>..............] - ETA: 1:15 - loss: 0.7690 - regression_loss: 0.6847 - classification_loss: 0.0843 280/500 [===============>..............] - ETA: 1:15 - loss: 0.7709 - regression_loss: 0.6864 - classification_loss: 0.0845 281/500 [===============>..............] - ETA: 1:14 - loss: 0.7714 - regression_loss: 0.6868 - classification_loss: 0.0846 282/500 [===============>..............] - ETA: 1:14 - loss: 0.7697 - regression_loss: 0.6852 - classification_loss: 0.0844 283/500 [===============>..............] - ETA: 1:14 - loss: 0.7690 - regression_loss: 0.6846 - classification_loss: 0.0844 284/500 [================>.............] - ETA: 1:13 - loss: 0.7696 - regression_loss: 0.6852 - classification_loss: 0.0844 285/500 [================>.............] - ETA: 1:13 - loss: 0.7688 - regression_loss: 0.6845 - classification_loss: 0.0843 286/500 [================>.............] 
- ETA: 1:13 - loss: 0.7703 - regression_loss: 0.6858 - classification_loss: 0.0845 287/500 [================>.............] - ETA: 1:12 - loss: 0.7712 - regression_loss: 0.6867 - classification_loss: 0.0845 288/500 [================>.............] - ETA: 1:12 - loss: 0.7698 - regression_loss: 0.6854 - classification_loss: 0.0844 289/500 [================>.............] - ETA: 1:12 - loss: 0.7696 - regression_loss: 0.6852 - classification_loss: 0.0843 290/500 [================>.............] - ETA: 1:11 - loss: 0.7694 - regression_loss: 0.6851 - classification_loss: 0.0843 291/500 [================>.............] - ETA: 1:11 - loss: 0.7694 - regression_loss: 0.6851 - classification_loss: 0.0843 292/500 [================>.............] - ETA: 1:11 - loss: 0.7683 - regression_loss: 0.6842 - classification_loss: 0.0841 293/500 [================>.............] - ETA: 1:10 - loss: 0.7679 - regression_loss: 0.6838 - classification_loss: 0.0840 294/500 [================>.............] - ETA: 1:10 - loss: 0.7684 - regression_loss: 0.6844 - classification_loss: 0.0841 295/500 [================>.............] - ETA: 1:10 - loss: 0.7681 - regression_loss: 0.6841 - classification_loss: 0.0840 296/500 [================>.............] - ETA: 1:09 - loss: 0.7683 - regression_loss: 0.6843 - classification_loss: 0.0840 297/500 [================>.............] - ETA: 1:09 - loss: 0.7676 - regression_loss: 0.6837 - classification_loss: 0.0839 298/500 [================>.............] - ETA: 1:09 - loss: 0.7679 - regression_loss: 0.6839 - classification_loss: 0.0840 299/500 [================>.............] - ETA: 1:08 - loss: 0.7670 - regression_loss: 0.6831 - classification_loss: 0.0839 300/500 [=================>............] - ETA: 1:08 - loss: 0.7689 - regression_loss: 0.6849 - classification_loss: 0.0840 301/500 [=================>............] - ETA: 1:08 - loss: 0.7693 - regression_loss: 0.6851 - classification_loss: 0.0842 302/500 [=================>............] 
- ETA: 1:07 - loss: 0.7701 - regression_loss: 0.6860 - classification_loss: 0.0841 303/500 [=================>............] - ETA: 1:07 - loss: 0.7702 - regression_loss: 0.6859 - classification_loss: 0.0842 304/500 [=================>............] - ETA: 1:07 - loss: 0.7699 - regression_loss: 0.6858 - classification_loss: 0.0842 305/500 [=================>............] - ETA: 1:06 - loss: 0.7709 - regression_loss: 0.6867 - classification_loss: 0.0842 306/500 [=================>............] - ETA: 1:06 - loss: 0.7714 - regression_loss: 0.6872 - classification_loss: 0.0842 307/500 [=================>............] - ETA: 1:06 - loss: 0.7702 - regression_loss: 0.6863 - classification_loss: 0.0840 308/500 [=================>............] - ETA: 1:05 - loss: 0.7703 - regression_loss: 0.6863 - classification_loss: 0.0839 309/500 [=================>............] - ETA: 1:05 - loss: 0.7705 - regression_loss: 0.6865 - classification_loss: 0.0840 310/500 [=================>............] - ETA: 1:05 - loss: 0.7709 - regression_loss: 0.6869 - classification_loss: 0.0840 311/500 [=================>............] - ETA: 1:04 - loss: 0.7711 - regression_loss: 0.6872 - classification_loss: 0.0839 312/500 [=================>............] - ETA: 1:04 - loss: 0.7718 - regression_loss: 0.6878 - classification_loss: 0.0840 313/500 [=================>............] - ETA: 1:04 - loss: 0.7711 - regression_loss: 0.6871 - classification_loss: 0.0840 314/500 [=================>............] - ETA: 1:03 - loss: 0.7702 - regression_loss: 0.6863 - classification_loss: 0.0839 315/500 [=================>............] - ETA: 1:03 - loss: 0.7698 - regression_loss: 0.6860 - classification_loss: 0.0838 316/500 [=================>............] - ETA: 1:03 - loss: 0.7697 - regression_loss: 0.6859 - classification_loss: 0.0837 317/500 [==================>...........] - ETA: 1:02 - loss: 0.7700 - regression_loss: 0.6862 - classification_loss: 0.0839 318/500 [==================>...........] 
- ETA: 1:02 - loss: 0.7701 - regression_loss: 0.6862 - classification_loss: 0.0838 319/500 [==================>...........] - ETA: 1:02 - loss: 0.7694 - regression_loss: 0.6856 - classification_loss: 0.0838 320/500 [==================>...........] - ETA: 1:01 - loss: 0.7687 - regression_loss: 0.6850 - classification_loss: 0.0837 321/500 [==================>...........] - ETA: 1:01 - loss: 0.7685 - regression_loss: 0.6848 - classification_loss: 0.0837 322/500 [==================>...........] - ETA: 1:01 - loss: 0.7702 - regression_loss: 0.6866 - classification_loss: 0.0836 323/500 [==================>...........] - ETA: 1:00 - loss: 0.7707 - regression_loss: 0.6871 - classification_loss: 0.0837 324/500 [==================>...........] - ETA: 1:00 - loss: 0.7717 - regression_loss: 0.6879 - classification_loss: 0.0838 325/500 [==================>...........] - ETA: 1:00 - loss: 0.7712 - regression_loss: 0.6876 - classification_loss: 0.0837 326/500 [==================>...........] - ETA: 59s - loss: 0.7712 - regression_loss: 0.6876 - classification_loss: 0.0836  327/500 [==================>...........] - ETA: 59s - loss: 0.7713 - regression_loss: 0.6877 - classification_loss: 0.0836 328/500 [==================>...........] - ETA: 59s - loss: 0.7711 - regression_loss: 0.6874 - classification_loss: 0.0837 329/500 [==================>...........] - ETA: 58s - loss: 0.7707 - regression_loss: 0.6869 - classification_loss: 0.0838 330/500 [==================>...........] - ETA: 58s - loss: 0.7710 - regression_loss: 0.6871 - classification_loss: 0.0838 331/500 [==================>...........] - ETA: 58s - loss: 0.7718 - regression_loss: 0.6880 - classification_loss: 0.0839 332/500 [==================>...........] - ETA: 57s - loss: 0.7730 - regression_loss: 0.6890 - classification_loss: 0.0840 333/500 [==================>...........] - ETA: 57s - loss: 0.7722 - regression_loss: 0.6883 - classification_loss: 0.0839 334/500 [===================>..........] 
- ETA: 56s - loss: 0.7712 - regression_loss: 0.6875 - classification_loss: 0.0837 335/500 [===================>..........] - ETA: 56s - loss: 0.7712 - regression_loss: 0.6876 - classification_loss: 0.0836 336/500 [===================>..........] - ETA: 56s - loss: 0.7718 - regression_loss: 0.6880 - classification_loss: 0.0837 337/500 [===================>..........] - ETA: 55s - loss: 0.7717 - regression_loss: 0.6880 - classification_loss: 0.0837 338/500 [===================>..........] - ETA: 55s - loss: 0.7724 - regression_loss: 0.6887 - classification_loss: 0.0837 339/500 [===================>..........] - ETA: 55s - loss: 0.7713 - regression_loss: 0.6878 - classification_loss: 0.0835 340/500 [===================>..........] - ETA: 54s - loss: 0.7709 - regression_loss: 0.6874 - classification_loss: 0.0835 341/500 [===================>..........] - ETA: 54s - loss: 0.7717 - regression_loss: 0.6881 - classification_loss: 0.0837 342/500 [===================>..........] - ETA: 54s - loss: 0.7716 - regression_loss: 0.6880 - classification_loss: 0.0837 343/500 [===================>..........] - ETA: 53s - loss: 0.7704 - regression_loss: 0.6869 - classification_loss: 0.0835 344/500 [===================>..........] - ETA: 53s - loss: 0.7706 - regression_loss: 0.6871 - classification_loss: 0.0835 345/500 [===================>..........] - ETA: 53s - loss: 0.7704 - regression_loss: 0.6869 - classification_loss: 0.0835 346/500 [===================>..........] - ETA: 52s - loss: 0.7691 - regression_loss: 0.6858 - classification_loss: 0.0834 347/500 [===================>..........] - ETA: 52s - loss: 0.7687 - regression_loss: 0.6853 - classification_loss: 0.0833 348/500 [===================>..........] - ETA: 52s - loss: 0.7685 - regression_loss: 0.6852 - classification_loss: 0.0833 349/500 [===================>..........] - ETA: 51s - loss: 0.7681 - regression_loss: 0.6849 - classification_loss: 0.0832 350/500 [====================>.........] 
- ETA: 51s - loss: 0.7675 - regression_loss: 0.6844 - classification_loss: 0.0831 351/500 [====================>.........] - ETA: 51s - loss: 0.7666 - regression_loss: 0.6837 - classification_loss: 0.0829 352/500 [====================>.........] - ETA: 50s - loss: 0.7657 - regression_loss: 0.6829 - classification_loss: 0.0828 353/500 [====================>.........] - ETA: 50s - loss: 0.7654 - regression_loss: 0.6827 - classification_loss: 0.0828 354/500 [====================>.........] - ETA: 50s - loss: 0.7657 - regression_loss: 0.6829 - classification_loss: 0.0827 355/500 [====================>.........] - ETA: 49s - loss: 0.7663 - regression_loss: 0.6835 - classification_loss: 0.0828 356/500 [====================>.........] - ETA: 49s - loss: 0.7653 - regression_loss: 0.6827 - classification_loss: 0.0826 357/500 [====================>.........] - ETA: 49s - loss: 0.7642 - regression_loss: 0.6818 - classification_loss: 0.0825 358/500 [====================>.........] - ETA: 48s - loss: 0.7637 - regression_loss: 0.6813 - classification_loss: 0.0824 359/500 [====================>.........] - ETA: 48s - loss: 0.7637 - regression_loss: 0.6813 - classification_loss: 0.0823 360/500 [====================>.........] - ETA: 48s - loss: 0.7653 - regression_loss: 0.6828 - classification_loss: 0.0825 361/500 [====================>.........] - ETA: 47s - loss: 0.7661 - regression_loss: 0.6836 - classification_loss: 0.0826 362/500 [====================>.........] - ETA: 47s - loss: 0.7656 - regression_loss: 0.6831 - classification_loss: 0.0825 363/500 [====================>.........] - ETA: 47s - loss: 0.7658 - regression_loss: 0.6832 - classification_loss: 0.0825 364/500 [====================>.........] - ETA: 46s - loss: 0.7653 - regression_loss: 0.6828 - classification_loss: 0.0825 365/500 [====================>.........] - ETA: 46s - loss: 0.7672 - regression_loss: 0.6845 - classification_loss: 0.0827 366/500 [====================>.........] 
- ETA: 46s - loss: 0.7658 - regression_loss: 0.6832 - classification_loss: 0.0826 367/500 [=====================>........] - ETA: 45s - loss: 0.7656 - regression_loss: 0.6830 - classification_loss: 0.0826 368/500 [=====================>........] - ETA: 45s - loss: 0.7643 - regression_loss: 0.6819 - classification_loss: 0.0824 369/500 [=====================>........] - ETA: 44s - loss: 0.7635 - regression_loss: 0.6812 - classification_loss: 0.0823 370/500 [=====================>........] - ETA: 44s - loss: 0.7626 - regression_loss: 0.6804 - classification_loss: 0.0822 371/500 [=====================>........] - ETA: 44s - loss: 0.7623 - regression_loss: 0.6801 - classification_loss: 0.0822 372/500 [=====================>........] - ETA: 43s - loss: 0.7625 - regression_loss: 0.6802 - classification_loss: 0.0823 373/500 [=====================>........] - ETA: 43s - loss: 0.7612 - regression_loss: 0.6791 - classification_loss: 0.0822 374/500 [=====================>........] - ETA: 43s - loss: 0.7606 - regression_loss: 0.6784 - classification_loss: 0.0821 375/500 [=====================>........] - ETA: 42s - loss: 0.7602 - regression_loss: 0.6781 - classification_loss: 0.0821 376/500 [=====================>........] - ETA: 42s - loss: 0.7611 - regression_loss: 0.6789 - classification_loss: 0.0821 377/500 [=====================>........] - ETA: 42s - loss: 0.7621 - regression_loss: 0.6799 - classification_loss: 0.0822 378/500 [=====================>........] - ETA: 41s - loss: 0.7610 - regression_loss: 0.6789 - classification_loss: 0.0821 379/500 [=====================>........] - ETA: 41s - loss: 0.7609 - regression_loss: 0.6788 - classification_loss: 0.0821 380/500 [=====================>........] - ETA: 41s - loss: 0.7606 - regression_loss: 0.6786 - classification_loss: 0.0820 381/500 [=====================>........] - ETA: 40s - loss: 0.7604 - regression_loss: 0.6784 - classification_loss: 0.0820 382/500 [=====================>........] 
- ETA: 40s - loss: 0.7615 - regression_loss: 0.6794 - classification_loss: 0.0821 383/500 [=====================>........] - ETA: 40s - loss: 0.7619 - regression_loss: 0.6797 - classification_loss: 0.0821 384/500 [======================>.......] - ETA: 39s - loss: 0.7607 - regression_loss: 0.6787 - classification_loss: 0.0820 385/500 [======================>.......] - ETA: 39s - loss: 0.7603 - regression_loss: 0.6783 - classification_loss: 0.0820 386/500 [======================>.......] - ETA: 39s - loss: 0.7601 - regression_loss: 0.6782 - classification_loss: 0.0819 387/500 [======================>.......] - ETA: 38s - loss: 0.7596 - regression_loss: 0.6778 - classification_loss: 0.0818 388/500 [======================>.......] - ETA: 38s - loss: 0.7598 - regression_loss: 0.6780 - classification_loss: 0.0818 389/500 [======================>.......] - ETA: 38s - loss: 0.7606 - regression_loss: 0.6787 - classification_loss: 0.0819 390/500 [======================>.......] - ETA: 37s - loss: 0.7605 - regression_loss: 0.6787 - classification_loss: 0.0819 391/500 [======================>.......] - ETA: 37s - loss: 0.7616 - regression_loss: 0.6797 - classification_loss: 0.0819 392/500 [======================>.......] - ETA: 37s - loss: 0.7625 - regression_loss: 0.6804 - classification_loss: 0.0821 393/500 [======================>.......] - ETA: 36s - loss: 0.7621 - regression_loss: 0.6800 - classification_loss: 0.0821 394/500 [======================>.......] - ETA: 36s - loss: 0.7624 - regression_loss: 0.6803 - classification_loss: 0.0821 395/500 [======================>.......] - ETA: 36s - loss: 0.7628 - regression_loss: 0.6807 - classification_loss: 0.0821 396/500 [======================>.......] - ETA: 35s - loss: 0.7642 - regression_loss: 0.6818 - classification_loss: 0.0824 397/500 [======================>.......] - ETA: 35s - loss: 0.7636 - regression_loss: 0.6812 - classification_loss: 0.0824 398/500 [======================>.......] 
- ETA: 35s - loss: 0.7634 - regression_loss: 0.6811 - classification_loss: 0.0823 399/500 [======================>.......] - ETA: 34s - loss: 0.7631 - regression_loss: 0.6809 - classification_loss: 0.0822 400/500 [=======================>......] - ETA: 34s - loss: 0.7622 - regression_loss: 0.6801 - classification_loss: 0.0821 401/500 [=======================>......] - ETA: 33s - loss: 0.7617 - regression_loss: 0.6796 - classification_loss: 0.0821 402/500 [=======================>......] - ETA: 33s - loss: 0.7612 - regression_loss: 0.6792 - classification_loss: 0.0820 403/500 [=======================>......] - ETA: 33s - loss: 0.7611 - regression_loss: 0.6792 - classification_loss: 0.0820 404/500 [=======================>......] - ETA: 32s - loss: 0.7616 - regression_loss: 0.6796 - classification_loss: 0.0820 405/500 [=======================>......] - ETA: 32s - loss: 0.7625 - regression_loss: 0.6803 - classification_loss: 0.0822 406/500 [=======================>......] - ETA: 32s - loss: 0.7626 - regression_loss: 0.6804 - classification_loss: 0.0822 407/500 [=======================>......] - ETA: 31s - loss: 0.7626 - regression_loss: 0.6805 - classification_loss: 0.0822 408/500 [=======================>......] - ETA: 31s - loss: 0.7619 - regression_loss: 0.6798 - classification_loss: 0.0821 409/500 [=======================>......] - ETA: 31s - loss: 0.7612 - regression_loss: 0.6792 - classification_loss: 0.0821 410/500 [=======================>......] - ETA: 30s - loss: 0.7622 - regression_loss: 0.6800 - classification_loss: 0.0821 411/500 [=======================>......] - ETA: 30s - loss: 0.7625 - regression_loss: 0.6803 - classification_loss: 0.0822 412/500 [=======================>......] - ETA: 30s - loss: 0.7620 - regression_loss: 0.6799 - classification_loss: 0.0821 413/500 [=======================>......] - ETA: 29s - loss: 0.7618 - regression_loss: 0.6798 - classification_loss: 0.0820 414/500 [=======================>......] 
- ETA: 29s - loss: 0.7620 - regression_loss: 0.6800 - classification_loss: 0.0820 415/500 [=======================>......] - ETA: 29s - loss: 0.7617 - regression_loss: 0.6798 - classification_loss: 0.0819 416/500 [=======================>......] - ETA: 28s - loss: 0.7608 - regression_loss: 0.6789 - classification_loss: 0.0818 417/500 [========================>.....] - ETA: 28s - loss: 0.7615 - regression_loss: 0.6795 - classification_loss: 0.0820 418/500 [========================>.....] - ETA: 28s - loss: 0.7620 - regression_loss: 0.6799 - classification_loss: 0.0821 419/500 [========================>.....] - ETA: 27s - loss: 0.7625 - regression_loss: 0.6804 - classification_loss: 0.0822 420/500 [========================>.....] - ETA: 27s - loss: 0.7625 - regression_loss: 0.6804 - classification_loss: 0.0821 421/500 [========================>.....] - ETA: 27s - loss: 0.7631 - regression_loss: 0.6809 - classification_loss: 0.0822 422/500 [========================>.....] - ETA: 26s - loss: 0.7636 - regression_loss: 0.6812 - classification_loss: 0.0824 423/500 [========================>.....] - ETA: 26s - loss: 0.7641 - regression_loss: 0.6818 - classification_loss: 0.0824 424/500 [========================>.....] - ETA: 26s - loss: 0.7654 - regression_loss: 0.6829 - classification_loss: 0.0824 425/500 [========================>.....] - ETA: 25s - loss: 0.7645 - regression_loss: 0.6821 - classification_loss: 0.0823 426/500 [========================>.....] - ETA: 25s - loss: 0.7660 - regression_loss: 0.6833 - classification_loss: 0.0827 427/500 [========================>.....] - ETA: 25s - loss: 0.7662 - regression_loss: 0.6835 - classification_loss: 0.0827 428/500 [========================>.....] - ETA: 24s - loss: 0.7666 - regression_loss: 0.6839 - classification_loss: 0.0828 429/500 [========================>.....] - ETA: 24s - loss: 0.7672 - regression_loss: 0.6843 - classification_loss: 0.0829 430/500 [========================>.....] 
- ETA: 24s - loss: 0.7675 - regression_loss: 0.6846 - classification_loss: 0.0829 431/500 [========================>.....] - ETA: 23s - loss: 0.7675 - regression_loss: 0.6846 - classification_loss: 0.0829 432/500 [========================>.....] - ETA: 23s - loss: 0.7673 - regression_loss: 0.6844 - classification_loss: 0.0829 433/500 [========================>.....] - ETA: 22s - loss: 0.7672 - regression_loss: 0.6844 - classification_loss: 0.0828 434/500 [=========================>....] - ETA: 22s - loss: 0.7678 - regression_loss: 0.6849 - classification_loss: 0.0829 435/500 [=========================>....] - ETA: 22s - loss: 0.7672 - regression_loss: 0.6844 - classification_loss: 0.0828 436/500 [=========================>....] - ETA: 21s - loss: 0.7674 - regression_loss: 0.6846 - classification_loss: 0.0829 437/500 [=========================>....] - ETA: 21s - loss: 0.7675 - regression_loss: 0.6846 - classification_loss: 0.0829 438/500 [=========================>....] - ETA: 21s - loss: 0.7674 - regression_loss: 0.6845 - classification_loss: 0.0829 439/500 [=========================>....] - ETA: 20s - loss: 0.7681 - regression_loss: 0.6850 - classification_loss: 0.0831 440/500 [=========================>....] - ETA: 20s - loss: 0.7685 - regression_loss: 0.6854 - classification_loss: 0.0831 441/500 [=========================>....] - ETA: 20s - loss: 0.7684 - regression_loss: 0.6854 - classification_loss: 0.0830 442/500 [=========================>....] - ETA: 19s - loss: 0.7690 - regression_loss: 0.6860 - classification_loss: 0.0830 443/500 [=========================>....] - ETA: 19s - loss: 0.7700 - regression_loss: 0.6868 - classification_loss: 0.0832 444/500 [=========================>....] - ETA: 19s - loss: 0.7704 - regression_loss: 0.6872 - classification_loss: 0.0832 445/500 [=========================>....] - ETA: 18s - loss: 0.7700 - regression_loss: 0.6869 - classification_loss: 0.0832 446/500 [=========================>....] 
[per-batch training progress for epoch 13 (steps 447-499) truncated]
500/500 [==============================] - 172s 343ms/step - loss: 0.7779 - regression_loss: 0.6938 - classification_loss: 0.0840
1172 instances of class plum with average precision: 0.7017
mAP: 0.7017
Epoch 00013: saving model to ./training/snapshots/resnet101_pascal_13.h5
Epoch 14/150
[per-batch training progress for epoch 14 truncated]
- ETA: 1:14 - loss: 0.7294 - regression_loss: 0.6527 - classification_loss: 0.0766 282/500 [===============>..............] - ETA: 1:14 - loss: 0.7284 - regression_loss: 0.6519 - classification_loss: 0.0765 283/500 [===============>..............] - ETA: 1:14 - loss: 0.7283 - regression_loss: 0.6519 - classification_loss: 0.0765 284/500 [================>.............] - ETA: 1:13 - loss: 0.7298 - regression_loss: 0.6532 - classification_loss: 0.0766 285/500 [================>.............] - ETA: 1:13 - loss: 0.7302 - regression_loss: 0.6536 - classification_loss: 0.0766 286/500 [================>.............] - ETA: 1:13 - loss: 0.7291 - regression_loss: 0.6525 - classification_loss: 0.0765 287/500 [================>.............] - ETA: 1:12 - loss: 0.7292 - regression_loss: 0.6526 - classification_loss: 0.0766 288/500 [================>.............] - ETA: 1:12 - loss: 0.7287 - regression_loss: 0.6522 - classification_loss: 0.0765 289/500 [================>.............] - ETA: 1:12 - loss: 0.7289 - regression_loss: 0.6524 - classification_loss: 0.0765 290/500 [================>.............] - ETA: 1:11 - loss: 0.7292 - regression_loss: 0.6527 - classification_loss: 0.0765 291/500 [================>.............] - ETA: 1:11 - loss: 0.7288 - regression_loss: 0.6524 - classification_loss: 0.0764 292/500 [================>.............] - ETA: 1:11 - loss: 0.7290 - regression_loss: 0.6525 - classification_loss: 0.0764 293/500 [================>.............] - ETA: 1:10 - loss: 0.7295 - regression_loss: 0.6531 - classification_loss: 0.0764 294/500 [================>.............] - ETA: 1:10 - loss: 0.7293 - regression_loss: 0.6529 - classification_loss: 0.0764 295/500 [================>.............] - ETA: 1:10 - loss: 0.7288 - regression_loss: 0.6525 - classification_loss: 0.0763 296/500 [================>.............] - ETA: 1:09 - loss: 0.7282 - regression_loss: 0.6520 - classification_loss: 0.0762 297/500 [================>.............] 
- ETA: 1:09 - loss: 0.7278 - regression_loss: 0.6516 - classification_loss: 0.0762 298/500 [================>.............] - ETA: 1:08 - loss: 0.7268 - regression_loss: 0.6507 - classification_loss: 0.0761 299/500 [================>.............] - ETA: 1:08 - loss: 0.7271 - regression_loss: 0.6510 - classification_loss: 0.0761 300/500 [=================>............] - ETA: 1:08 - loss: 0.7278 - regression_loss: 0.6517 - classification_loss: 0.0762 301/500 [=================>............] - ETA: 1:07 - loss: 0.7278 - regression_loss: 0.6517 - classification_loss: 0.0761 302/500 [=================>............] - ETA: 1:07 - loss: 0.7273 - regression_loss: 0.6513 - classification_loss: 0.0760 303/500 [=================>............] - ETA: 1:07 - loss: 0.7272 - regression_loss: 0.6513 - classification_loss: 0.0759 304/500 [=================>............] - ETA: 1:06 - loss: 0.7266 - regression_loss: 0.6508 - classification_loss: 0.0759 305/500 [=================>............] - ETA: 1:06 - loss: 0.7258 - regression_loss: 0.6500 - classification_loss: 0.0758 306/500 [=================>............] - ETA: 1:06 - loss: 0.7258 - regression_loss: 0.6500 - classification_loss: 0.0758 307/500 [=================>............] - ETA: 1:05 - loss: 0.7264 - regression_loss: 0.6505 - classification_loss: 0.0758 308/500 [=================>............] - ETA: 1:05 - loss: 0.7265 - regression_loss: 0.6507 - classification_loss: 0.0759 309/500 [=================>............] - ETA: 1:05 - loss: 0.7258 - regression_loss: 0.6500 - classification_loss: 0.0758 310/500 [=================>............] - ETA: 1:04 - loss: 0.7251 - regression_loss: 0.6494 - classification_loss: 0.0757 311/500 [=================>............] - ETA: 1:04 - loss: 0.7248 - regression_loss: 0.6491 - classification_loss: 0.0757 312/500 [=================>............] - ETA: 1:04 - loss: 0.7236 - regression_loss: 0.6480 - classification_loss: 0.0756 313/500 [=================>............] 
- ETA: 1:03 - loss: 0.7244 - regression_loss: 0.6490 - classification_loss: 0.0754 314/500 [=================>............] - ETA: 1:03 - loss: 0.7254 - regression_loss: 0.6498 - classification_loss: 0.0755 315/500 [=================>............] - ETA: 1:03 - loss: 0.7243 - regression_loss: 0.6489 - classification_loss: 0.0754 316/500 [=================>............] - ETA: 1:02 - loss: 0.7246 - regression_loss: 0.6492 - classification_loss: 0.0754 317/500 [==================>...........] - ETA: 1:02 - loss: 0.7250 - regression_loss: 0.6496 - classification_loss: 0.0755 318/500 [==================>...........] - ETA: 1:02 - loss: 0.7256 - regression_loss: 0.6501 - classification_loss: 0.0755 319/500 [==================>...........] - ETA: 1:01 - loss: 0.7271 - regression_loss: 0.6513 - classification_loss: 0.0758 320/500 [==================>...........] - ETA: 1:01 - loss: 0.7289 - regression_loss: 0.6529 - classification_loss: 0.0760 321/500 [==================>...........] - ETA: 1:01 - loss: 0.7296 - regression_loss: 0.6534 - classification_loss: 0.0762 322/500 [==================>...........] - ETA: 1:00 - loss: 0.7292 - regression_loss: 0.6530 - classification_loss: 0.0762 323/500 [==================>...........] - ETA: 1:00 - loss: 0.7282 - regression_loss: 0.6521 - classification_loss: 0.0761 324/500 [==================>...........] - ETA: 1:00 - loss: 0.7283 - regression_loss: 0.6523 - classification_loss: 0.0760 325/500 [==================>...........] - ETA: 59s - loss: 0.7279 - regression_loss: 0.6520 - classification_loss: 0.0759  326/500 [==================>...........] - ETA: 59s - loss: 0.7283 - regression_loss: 0.6524 - classification_loss: 0.0759 327/500 [==================>...........] - ETA: 59s - loss: 0.7283 - regression_loss: 0.6525 - classification_loss: 0.0759 328/500 [==================>...........] - ETA: 58s - loss: 0.7299 - regression_loss: 0.6538 - classification_loss: 0.0761 329/500 [==================>...........] 
- ETA: 58s - loss: 0.7303 - regression_loss: 0.6542 - classification_loss: 0.0761 330/500 [==================>...........] - ETA: 58s - loss: 0.7307 - regression_loss: 0.6546 - classification_loss: 0.0762 331/500 [==================>...........] - ETA: 57s - loss: 0.7298 - regression_loss: 0.6538 - classification_loss: 0.0760 332/500 [==================>...........] - ETA: 57s - loss: 0.7298 - regression_loss: 0.6538 - classification_loss: 0.0760 333/500 [==================>...........] - ETA: 57s - loss: 0.7299 - regression_loss: 0.6540 - classification_loss: 0.0759 334/500 [===================>..........] - ETA: 56s - loss: 0.7291 - regression_loss: 0.6533 - classification_loss: 0.0757 335/500 [===================>..........] - ETA: 56s - loss: 0.7294 - regression_loss: 0.6537 - classification_loss: 0.0757 336/500 [===================>..........] - ETA: 56s - loss: 0.7293 - regression_loss: 0.6536 - classification_loss: 0.0757 337/500 [===================>..........] - ETA: 55s - loss: 0.7297 - regression_loss: 0.6540 - classification_loss: 0.0757 338/500 [===================>..........] - ETA: 55s - loss: 0.7302 - regression_loss: 0.6544 - classification_loss: 0.0758 339/500 [===================>..........] - ETA: 55s - loss: 0.7289 - regression_loss: 0.6532 - classification_loss: 0.0757 340/500 [===================>..........] - ETA: 54s - loss: 0.7296 - regression_loss: 0.6538 - classification_loss: 0.0758 341/500 [===================>..........] - ETA: 54s - loss: 0.7303 - regression_loss: 0.6542 - classification_loss: 0.0760 342/500 [===================>..........] - ETA: 54s - loss: 0.7308 - regression_loss: 0.6548 - classification_loss: 0.0760 343/500 [===================>..........] - ETA: 53s - loss: 0.7307 - regression_loss: 0.6547 - classification_loss: 0.0760 344/500 [===================>..........] - ETA: 53s - loss: 0.7305 - regression_loss: 0.6546 - classification_loss: 0.0759 345/500 [===================>..........] 
- ETA: 53s - loss: 0.7308 - regression_loss: 0.6549 - classification_loss: 0.0760 346/500 [===================>..........] - ETA: 52s - loss: 0.7309 - regression_loss: 0.6548 - classification_loss: 0.0760 347/500 [===================>..........] - ETA: 52s - loss: 0.7308 - regression_loss: 0.6547 - classification_loss: 0.0761 348/500 [===================>..........] - ETA: 52s - loss: 0.7311 - regression_loss: 0.6549 - classification_loss: 0.0762 349/500 [===================>..........] - ETA: 51s - loss: 0.7305 - regression_loss: 0.6543 - classification_loss: 0.0762 350/500 [====================>.........] - ETA: 51s - loss: 0.7313 - regression_loss: 0.6550 - classification_loss: 0.0763 351/500 [====================>.........] - ETA: 50s - loss: 0.7317 - regression_loss: 0.6552 - classification_loss: 0.0764 352/500 [====================>.........] - ETA: 50s - loss: 0.7326 - regression_loss: 0.6561 - classification_loss: 0.0765 353/500 [====================>.........] - ETA: 50s - loss: 0.7336 - regression_loss: 0.6570 - classification_loss: 0.0767 354/500 [====================>.........] - ETA: 49s - loss: 0.7339 - regression_loss: 0.6573 - classification_loss: 0.0767 355/500 [====================>.........] - ETA: 49s - loss: 0.7346 - regression_loss: 0.6580 - classification_loss: 0.0766 356/500 [====================>.........] - ETA: 49s - loss: 0.7333 - regression_loss: 0.6569 - classification_loss: 0.0764 357/500 [====================>.........] - ETA: 48s - loss: 0.7327 - regression_loss: 0.6564 - classification_loss: 0.0763 358/500 [====================>.........] - ETA: 48s - loss: 0.7324 - regression_loss: 0.6562 - classification_loss: 0.0763 359/500 [====================>.........] - ETA: 48s - loss: 0.7317 - regression_loss: 0.6556 - classification_loss: 0.0761 360/500 [====================>.........] - ETA: 47s - loss: 0.7324 - regression_loss: 0.6561 - classification_loss: 0.0763 361/500 [====================>.........] 
- ETA: 47s - loss: 0.7318 - regression_loss: 0.6556 - classification_loss: 0.0762 362/500 [====================>.........] - ETA: 47s - loss: 0.7328 - regression_loss: 0.6563 - classification_loss: 0.0765 363/500 [====================>.........] - ETA: 46s - loss: 0.7324 - regression_loss: 0.6559 - classification_loss: 0.0765 364/500 [====================>.........] - ETA: 46s - loss: 0.7326 - regression_loss: 0.6561 - classification_loss: 0.0765 365/500 [====================>.........] - ETA: 46s - loss: 0.7323 - regression_loss: 0.6559 - classification_loss: 0.0764 366/500 [====================>.........] - ETA: 45s - loss: 0.7330 - regression_loss: 0.6566 - classification_loss: 0.0763 367/500 [=====================>........] - ETA: 45s - loss: 0.7325 - regression_loss: 0.6562 - classification_loss: 0.0762 368/500 [=====================>........] - ETA: 45s - loss: 0.7327 - regression_loss: 0.6564 - classification_loss: 0.0763 369/500 [=====================>........] - ETA: 44s - loss: 0.7316 - regression_loss: 0.6555 - classification_loss: 0.0761 370/500 [=====================>........] - ETA: 44s - loss: 0.7318 - regression_loss: 0.6556 - classification_loss: 0.0761 371/500 [=====================>........] - ETA: 44s - loss: 0.7317 - regression_loss: 0.6556 - classification_loss: 0.0761 372/500 [=====================>........] - ETA: 43s - loss: 0.7311 - regression_loss: 0.6551 - classification_loss: 0.0760 373/500 [=====================>........] - ETA: 43s - loss: 0.7303 - regression_loss: 0.6544 - classification_loss: 0.0759 374/500 [=====================>........] - ETA: 43s - loss: 0.7315 - regression_loss: 0.6554 - classification_loss: 0.0761 375/500 [=====================>........] - ETA: 42s - loss: 0.7307 - regression_loss: 0.6547 - classification_loss: 0.0760 376/500 [=====================>........] - ETA: 42s - loss: 0.7304 - regression_loss: 0.6545 - classification_loss: 0.0759 377/500 [=====================>........] 
- ETA: 42s - loss: 0.7299 - regression_loss: 0.6540 - classification_loss: 0.0758 378/500 [=====================>........] - ETA: 41s - loss: 0.7298 - regression_loss: 0.6538 - classification_loss: 0.0759 379/500 [=====================>........] - ETA: 41s - loss: 0.7295 - regression_loss: 0.6535 - classification_loss: 0.0759 380/500 [=====================>........] - ETA: 41s - loss: 0.7298 - regression_loss: 0.6539 - classification_loss: 0.0759 381/500 [=====================>........] - ETA: 40s - loss: 0.7306 - regression_loss: 0.6547 - classification_loss: 0.0759 382/500 [=====================>........] - ETA: 40s - loss: 0.7300 - regression_loss: 0.6541 - classification_loss: 0.0759 383/500 [=====================>........] - ETA: 40s - loss: 0.7299 - regression_loss: 0.6541 - classification_loss: 0.0758 384/500 [======================>.......] - ETA: 39s - loss: 0.7301 - regression_loss: 0.6542 - classification_loss: 0.0759 385/500 [======================>.......] - ETA: 39s - loss: 0.7301 - regression_loss: 0.6542 - classification_loss: 0.0759 386/500 [======================>.......] - ETA: 39s - loss: 0.7298 - regression_loss: 0.6539 - classification_loss: 0.0759 387/500 [======================>.......] - ETA: 38s - loss: 0.7296 - regression_loss: 0.6537 - classification_loss: 0.0759 388/500 [======================>.......] - ETA: 38s - loss: 0.7297 - regression_loss: 0.6537 - classification_loss: 0.0760 389/500 [======================>.......] - ETA: 38s - loss: 0.7297 - regression_loss: 0.6536 - classification_loss: 0.0760 390/500 [======================>.......] - ETA: 37s - loss: 0.7293 - regression_loss: 0.6533 - classification_loss: 0.0760 391/500 [======================>.......] - ETA: 37s - loss: 0.7293 - regression_loss: 0.6533 - classification_loss: 0.0760 392/500 [======================>.......] - ETA: 36s - loss: 0.7304 - regression_loss: 0.6543 - classification_loss: 0.0761 393/500 [======================>.......] 
- ETA: 36s - loss: 0.7301 - regression_loss: 0.6541 - classification_loss: 0.0761 394/500 [======================>.......] - ETA: 36s - loss: 0.7308 - regression_loss: 0.6546 - classification_loss: 0.0762 395/500 [======================>.......] - ETA: 35s - loss: 0.7301 - regression_loss: 0.6540 - classification_loss: 0.0761 396/500 [======================>.......] - ETA: 35s - loss: 0.7304 - regression_loss: 0.6542 - classification_loss: 0.0761 397/500 [======================>.......] - ETA: 35s - loss: 0.7314 - regression_loss: 0.6554 - classification_loss: 0.0760 398/500 [======================>.......] - ETA: 34s - loss: 0.7310 - regression_loss: 0.6550 - classification_loss: 0.0760 399/500 [======================>.......] - ETA: 34s - loss: 0.7303 - regression_loss: 0.6544 - classification_loss: 0.0759 400/500 [=======================>......] - ETA: 34s - loss: 0.7311 - regression_loss: 0.6552 - classification_loss: 0.0759 401/500 [=======================>......] - ETA: 33s - loss: 0.7319 - regression_loss: 0.6559 - classification_loss: 0.0760 402/500 [=======================>......] - ETA: 33s - loss: 0.7329 - regression_loss: 0.6567 - classification_loss: 0.0762 403/500 [=======================>......] - ETA: 33s - loss: 0.7321 - regression_loss: 0.6561 - classification_loss: 0.0760 404/500 [=======================>......] - ETA: 32s - loss: 0.7323 - regression_loss: 0.6563 - classification_loss: 0.0760 405/500 [=======================>......] - ETA: 32s - loss: 0.7336 - regression_loss: 0.6574 - classification_loss: 0.0763 406/500 [=======================>......] - ETA: 32s - loss: 0.7339 - regression_loss: 0.6576 - classification_loss: 0.0763 407/500 [=======================>......] - ETA: 31s - loss: 0.7330 - regression_loss: 0.6568 - classification_loss: 0.0762 408/500 [=======================>......] - ETA: 31s - loss: 0.7325 - regression_loss: 0.6563 - classification_loss: 0.0762 409/500 [=======================>......] 
- ETA: 31s - loss: 0.7335 - regression_loss: 0.6572 - classification_loss: 0.0763 410/500 [=======================>......] - ETA: 30s - loss: 0.7331 - regression_loss: 0.6568 - classification_loss: 0.0763 411/500 [=======================>......] - ETA: 30s - loss: 0.7322 - regression_loss: 0.6560 - classification_loss: 0.0763 412/500 [=======================>......] - ETA: 30s - loss: 0.7313 - regression_loss: 0.6552 - classification_loss: 0.0762 413/500 [=======================>......] - ETA: 29s - loss: 0.7311 - regression_loss: 0.6550 - classification_loss: 0.0761 414/500 [=======================>......] - ETA: 29s - loss: 0.7309 - regression_loss: 0.6549 - classification_loss: 0.0760 415/500 [=======================>......] - ETA: 29s - loss: 0.7300 - regression_loss: 0.6541 - classification_loss: 0.0759 416/500 [=======================>......] - ETA: 28s - loss: 0.7293 - regression_loss: 0.6535 - classification_loss: 0.0759 417/500 [========================>.....] - ETA: 28s - loss: 0.7294 - regression_loss: 0.6535 - classification_loss: 0.0759 418/500 [========================>.....] - ETA: 28s - loss: 0.7290 - regression_loss: 0.6531 - classification_loss: 0.0759 419/500 [========================>.....] - ETA: 27s - loss: 0.7283 - regression_loss: 0.6525 - classification_loss: 0.0758 420/500 [========================>.....] - ETA: 27s - loss: 0.7286 - regression_loss: 0.6529 - classification_loss: 0.0757 421/500 [========================>.....] - ETA: 27s - loss: 0.7282 - regression_loss: 0.6525 - classification_loss: 0.0757 422/500 [========================>.....] - ETA: 26s - loss: 0.7276 - regression_loss: 0.6520 - classification_loss: 0.0756 423/500 [========================>.....] - ETA: 26s - loss: 0.7275 - regression_loss: 0.6519 - classification_loss: 0.0756 424/500 [========================>.....] - ETA: 26s - loss: 0.7271 - regression_loss: 0.6516 - classification_loss: 0.0755 425/500 [========================>.....] 
- ETA: 25s - loss: 0.7275 - regression_loss: 0.6519 - classification_loss: 0.0755 426/500 [========================>.....] - ETA: 25s - loss: 0.7277 - regression_loss: 0.6521 - classification_loss: 0.0756 427/500 [========================>.....] - ETA: 25s - loss: 0.7271 - regression_loss: 0.6515 - classification_loss: 0.0755 428/500 [========================>.....] - ETA: 24s - loss: 0.7262 - regression_loss: 0.6507 - classification_loss: 0.0755 429/500 [========================>.....] - ETA: 24s - loss: 0.7264 - regression_loss: 0.6508 - classification_loss: 0.0755 430/500 [========================>.....] - ETA: 23s - loss: 0.7263 - regression_loss: 0.6508 - classification_loss: 0.0755 431/500 [========================>.....] - ETA: 23s - loss: 0.7269 - regression_loss: 0.6513 - classification_loss: 0.0756 432/500 [========================>.....] - ETA: 23s - loss: 0.7266 - regression_loss: 0.6510 - classification_loss: 0.0756 433/500 [========================>.....] - ETA: 22s - loss: 0.7278 - regression_loss: 0.6521 - classification_loss: 0.0758 434/500 [=========================>....] - ETA: 22s - loss: 0.7277 - regression_loss: 0.6519 - classification_loss: 0.0758 435/500 [=========================>....] - ETA: 22s - loss: 0.7274 - regression_loss: 0.6516 - classification_loss: 0.0758 436/500 [=========================>....] - ETA: 21s - loss: 0.7267 - regression_loss: 0.6510 - classification_loss: 0.0757 437/500 [=========================>....] - ETA: 21s - loss: 0.7268 - regression_loss: 0.6509 - classification_loss: 0.0758 438/500 [=========================>....] - ETA: 21s - loss: 0.7262 - regression_loss: 0.6504 - classification_loss: 0.0758 439/500 [=========================>....] - ETA: 20s - loss: 0.7268 - regression_loss: 0.6510 - classification_loss: 0.0758 440/500 [=========================>....] - ETA: 20s - loss: 0.7267 - regression_loss: 0.6509 - classification_loss: 0.0758 441/500 [=========================>....] 
- ETA: 20s - loss: 0.7263 - regression_loss: 0.6505 - classification_loss: 0.0757 442/500 [=========================>....] - ETA: 19s - loss: 0.7266 - regression_loss: 0.6508 - classification_loss: 0.0758 443/500 [=========================>....] - ETA: 19s - loss: 0.7262 - regression_loss: 0.6504 - classification_loss: 0.0758 444/500 [=========================>....] - ETA: 19s - loss: 0.7259 - regression_loss: 0.6502 - classification_loss: 0.0757 445/500 [=========================>....] - ETA: 18s - loss: 0.7250 - regression_loss: 0.6494 - classification_loss: 0.0756 446/500 [=========================>....] - ETA: 18s - loss: 0.7245 - regression_loss: 0.6490 - classification_loss: 0.0756 447/500 [=========================>....] - ETA: 18s - loss: 0.7240 - regression_loss: 0.6485 - classification_loss: 0.0755 448/500 [=========================>....] - ETA: 17s - loss: 0.7240 - regression_loss: 0.6486 - classification_loss: 0.0755 449/500 [=========================>....] - ETA: 17s - loss: 0.7246 - regression_loss: 0.6490 - classification_loss: 0.0755 450/500 [==========================>...] - ETA: 17s - loss: 0.7242 - regression_loss: 0.6487 - classification_loss: 0.0755 451/500 [==========================>...] - ETA: 16s - loss: 0.7242 - regression_loss: 0.6487 - classification_loss: 0.0755 452/500 [==========================>...] - ETA: 16s - loss: 0.7245 - regression_loss: 0.6490 - classification_loss: 0.0756 453/500 [==========================>...] - ETA: 16s - loss: 0.7238 - regression_loss: 0.6484 - classification_loss: 0.0755 454/500 [==========================>...] - ETA: 15s - loss: 0.7245 - regression_loss: 0.6490 - classification_loss: 0.0755 455/500 [==========================>...] - ETA: 15s - loss: 0.7251 - regression_loss: 0.6494 - classification_loss: 0.0757 456/500 [==========================>...] - ETA: 15s - loss: 0.7254 - regression_loss: 0.6497 - classification_loss: 0.0757 457/500 [==========================>...] 
- ETA: 14s - loss: 0.7253 - regression_loss: 0.6497 - classification_loss: 0.0756 458/500 [==========================>...] - ETA: 14s - loss: 0.7247 - regression_loss: 0.6491 - classification_loss: 0.0756 459/500 [==========================>...] - ETA: 14s - loss: 0.7237 - regression_loss: 0.6483 - classification_loss: 0.0755 460/500 [==========================>...] - ETA: 13s - loss: 0.7233 - regression_loss: 0.6479 - classification_loss: 0.0754 461/500 [==========================>...] - ETA: 13s - loss: 0.7235 - regression_loss: 0.6480 - classification_loss: 0.0755 462/500 [==========================>...] - ETA: 13s - loss: 0.7236 - regression_loss: 0.6481 - classification_loss: 0.0754 463/500 [==========================>...] - ETA: 12s - loss: 0.7239 - regression_loss: 0.6485 - classification_loss: 0.0754 464/500 [==========================>...] - ETA: 12s - loss: 0.7241 - regression_loss: 0.6486 - classification_loss: 0.0755 465/500 [==========================>...] - ETA: 11s - loss: 0.7246 - regression_loss: 0.6490 - classification_loss: 0.0755 466/500 [==========================>...] - ETA: 11s - loss: 0.7249 - regression_loss: 0.6493 - classification_loss: 0.0755 467/500 [===========================>..] - ETA: 11s - loss: 0.7246 - regression_loss: 0.6491 - classification_loss: 0.0755 468/500 [===========================>..] - ETA: 10s - loss: 0.7242 - regression_loss: 0.6487 - classification_loss: 0.0755 469/500 [===========================>..] - ETA: 10s - loss: 0.7239 - regression_loss: 0.6485 - classification_loss: 0.0755 470/500 [===========================>..] - ETA: 10s - loss: 0.7231 - regression_loss: 0.6477 - classification_loss: 0.0754 471/500 [===========================>..] - ETA: 9s - loss: 0.7229 - regression_loss: 0.6475 - classification_loss: 0.0754  472/500 [===========================>..] - ETA: 9s - loss: 0.7233 - regression_loss: 0.6479 - classification_loss: 0.0754 473/500 [===========================>..] 
- ETA: 9s - loss: 0.7234 - regression_loss: 0.6480 - classification_loss: 0.0754 474/500 [===========================>..] - ETA: 8s - loss: 0.7230 - regression_loss: 0.6476 - classification_loss: 0.0754 475/500 [===========================>..] - ETA: 8s - loss: 0.7223 - regression_loss: 0.6470 - classification_loss: 0.0753 476/500 [===========================>..] - ETA: 8s - loss: 0.7223 - regression_loss: 0.6471 - classification_loss: 0.0752 477/500 [===========================>..] - ETA: 7s - loss: 0.7219 - regression_loss: 0.6467 - classification_loss: 0.0752 478/500 [===========================>..] - ETA: 7s - loss: 0.7227 - regression_loss: 0.6474 - classification_loss: 0.0753 479/500 [===========================>..] - ETA: 7s - loss: 0.7231 - regression_loss: 0.6476 - classification_loss: 0.0754 480/500 [===========================>..] - ETA: 6s - loss: 0.7224 - regression_loss: 0.6470 - classification_loss: 0.0753 481/500 [===========================>..] - ETA: 6s - loss: 0.7218 - regression_loss: 0.6465 - classification_loss: 0.0753 482/500 [===========================>..] - ETA: 6s - loss: 0.7221 - regression_loss: 0.6467 - classification_loss: 0.0754 483/500 [===========================>..] - ETA: 5s - loss: 0.7216 - regression_loss: 0.6462 - classification_loss: 0.0753 484/500 [============================>.] - ETA: 5s - loss: 0.7224 - regression_loss: 0.6470 - classification_loss: 0.0754 485/500 [============================>.] - ETA: 5s - loss: 0.7224 - regression_loss: 0.6470 - classification_loss: 0.0754 486/500 [============================>.] - ETA: 4s - loss: 0.7216 - regression_loss: 0.6463 - classification_loss: 0.0754 487/500 [============================>.] - ETA: 4s - loss: 0.7215 - regression_loss: 0.6462 - classification_loss: 0.0753 488/500 [============================>.] - ETA: 4s - loss: 0.7209 - regression_loss: 0.6457 - classification_loss: 0.0752 489/500 [============================>.] 
[Epoch 14 progress, batches 490-499/500: loss settles at ~0.719]
500/500 [==============================] - 171s 343ms/step - loss: 0.7189 - regression_loss: 0.6439 - classification_loss: 0.0750
1172 instances of class plum with average precision: 0.7265
mAP: 0.7265
Epoch 00014: saving model to ./training/snapshots/resnet101_pascal_14.h5
Epoch 15/150
[Epoch 15 progress, batches 1-3/500: loss climbs from 0.4610 to 0.8174 as the running averages initialize]
[Epoch 15 progress, batches 4-51/500: loss peaks near 0.90 around batch 13, then declines to ~0.77 (regression_loss ~0.69, classification_loss ~0.080)]
- ETA: 2:35 - loss: 0.7669 - regression_loss: 0.6879 - classification_loss: 0.0790 53/500 [==>...........................] - ETA: 2:34 - loss: 0.7691 - regression_loss: 0.6896 - classification_loss: 0.0795 54/500 [==>...........................] - ETA: 2:34 - loss: 0.7704 - regression_loss: 0.6906 - classification_loss: 0.0798 55/500 [==>...........................] - ETA: 2:33 - loss: 0.7719 - regression_loss: 0.6919 - classification_loss: 0.0800 56/500 [==>...........................] - ETA: 2:33 - loss: 0.7721 - regression_loss: 0.6920 - classification_loss: 0.0801 57/500 [==>...........................] - ETA: 2:33 - loss: 0.7694 - regression_loss: 0.6891 - classification_loss: 0.0803 58/500 [==>...........................] - ETA: 2:32 - loss: 0.7658 - regression_loss: 0.6856 - classification_loss: 0.0801 59/500 [==>...........................] - ETA: 2:32 - loss: 0.7625 - regression_loss: 0.6826 - classification_loss: 0.0799 60/500 [==>...........................] - ETA: 2:32 - loss: 0.7606 - regression_loss: 0.6807 - classification_loss: 0.0799 61/500 [==>...........................] - ETA: 2:31 - loss: 0.7594 - regression_loss: 0.6795 - classification_loss: 0.0799 62/500 [==>...........................] - ETA: 2:31 - loss: 0.7599 - regression_loss: 0.6796 - classification_loss: 0.0803 63/500 [==>...........................] - ETA: 2:30 - loss: 0.7566 - regression_loss: 0.6767 - classification_loss: 0.0799 64/500 [==>...........................] - ETA: 2:30 - loss: 0.7579 - regression_loss: 0.6778 - classification_loss: 0.0801 65/500 [==>...........................] - ETA: 2:30 - loss: 0.7581 - regression_loss: 0.6782 - classification_loss: 0.0799 66/500 [==>...........................] - ETA: 2:29 - loss: 0.7597 - regression_loss: 0.6796 - classification_loss: 0.0801 67/500 [===>..........................] - ETA: 2:29 - loss: 0.7552 - regression_loss: 0.6757 - classification_loss: 0.0795 68/500 [===>..........................] 
- ETA: 2:28 - loss: 0.7536 - regression_loss: 0.6744 - classification_loss: 0.0792 69/500 [===>..........................] - ETA: 2:28 - loss: 0.7473 - regression_loss: 0.6686 - classification_loss: 0.0788 70/500 [===>..........................] - ETA: 2:28 - loss: 0.7507 - regression_loss: 0.6717 - classification_loss: 0.0790 71/500 [===>..........................] - ETA: 2:27 - loss: 0.7553 - regression_loss: 0.6753 - classification_loss: 0.0801 72/500 [===>..........................] - ETA: 2:27 - loss: 0.7508 - regression_loss: 0.6711 - classification_loss: 0.0797 73/500 [===>..........................] - ETA: 2:26 - loss: 0.7481 - regression_loss: 0.6687 - classification_loss: 0.0794 74/500 [===>..........................] - ETA: 2:26 - loss: 0.7463 - regression_loss: 0.6672 - classification_loss: 0.0791 75/500 [===>..........................] - ETA: 2:26 - loss: 0.7446 - regression_loss: 0.6658 - classification_loss: 0.0788 76/500 [===>..........................] - ETA: 2:25 - loss: 0.7437 - regression_loss: 0.6650 - classification_loss: 0.0787 77/500 [===>..........................] - ETA: 2:25 - loss: 0.7427 - regression_loss: 0.6639 - classification_loss: 0.0787 78/500 [===>..........................] - ETA: 2:25 - loss: 0.7435 - regression_loss: 0.6645 - classification_loss: 0.0789 79/500 [===>..........................] - ETA: 2:24 - loss: 0.7424 - regression_loss: 0.6636 - classification_loss: 0.0788 80/500 [===>..........................] - ETA: 2:24 - loss: 0.7395 - regression_loss: 0.6608 - classification_loss: 0.0787 81/500 [===>..........................] - ETA: 2:24 - loss: 0.7337 - regression_loss: 0.6558 - classification_loss: 0.0780 82/500 [===>..........................] - ETA: 2:23 - loss: 0.7325 - regression_loss: 0.6544 - classification_loss: 0.0781 83/500 [===>..........................] - ETA: 2:23 - loss: 0.7343 - regression_loss: 0.6566 - classification_loss: 0.0777 84/500 [====>.........................] 
- ETA: 2:23 - loss: 0.7317 - regression_loss: 0.6545 - classification_loss: 0.0772 85/500 [====>.........................] - ETA: 2:22 - loss: 0.7280 - regression_loss: 0.6512 - classification_loss: 0.0768 86/500 [====>.........................] - ETA: 2:22 - loss: 0.7257 - regression_loss: 0.6492 - classification_loss: 0.0764 87/500 [====>.........................] - ETA: 2:22 - loss: 0.7247 - regression_loss: 0.6486 - classification_loss: 0.0761 88/500 [====>.........................] - ETA: 2:21 - loss: 0.7255 - regression_loss: 0.6494 - classification_loss: 0.0761 89/500 [====>.........................] - ETA: 2:21 - loss: 0.7230 - regression_loss: 0.6471 - classification_loss: 0.0759 90/500 [====>.........................] - ETA: 2:21 - loss: 0.7195 - regression_loss: 0.6443 - classification_loss: 0.0753 91/500 [====>.........................] - ETA: 2:20 - loss: 0.7189 - regression_loss: 0.6437 - classification_loss: 0.0752 92/500 [====>.........................] - ETA: 2:20 - loss: 0.7166 - regression_loss: 0.6419 - classification_loss: 0.0748 93/500 [====>.........................] - ETA: 2:20 - loss: 0.7139 - regression_loss: 0.6394 - classification_loss: 0.0745 94/500 [====>.........................] - ETA: 2:19 - loss: 0.7128 - regression_loss: 0.6386 - classification_loss: 0.0742 95/500 [====>.........................] - ETA: 2:19 - loss: 0.7109 - regression_loss: 0.6369 - classification_loss: 0.0740 96/500 [====>.........................] - ETA: 2:19 - loss: 0.7074 - regression_loss: 0.6338 - classification_loss: 0.0736 97/500 [====>.........................] - ETA: 2:18 - loss: 0.7092 - regression_loss: 0.6353 - classification_loss: 0.0739 98/500 [====>.........................] - ETA: 2:18 - loss: 0.7070 - regression_loss: 0.6335 - classification_loss: 0.0735 99/500 [====>.........................] - ETA: 2:18 - loss: 0.7070 - regression_loss: 0.6334 - classification_loss: 0.0736 100/500 [=====>........................] 
- ETA: 2:17 - loss: 0.7072 - regression_loss: 0.6337 - classification_loss: 0.0735 101/500 [=====>........................] - ETA: 2:17 - loss: 0.7042 - regression_loss: 0.6311 - classification_loss: 0.0731 102/500 [=====>........................] - ETA: 2:17 - loss: 0.7027 - regression_loss: 0.6298 - classification_loss: 0.0729 103/500 [=====>........................] - ETA: 2:16 - loss: 0.7060 - regression_loss: 0.6334 - classification_loss: 0.0726 104/500 [=====>........................] - ETA: 2:16 - loss: 0.7041 - regression_loss: 0.6318 - classification_loss: 0.0723 105/500 [=====>........................] - ETA: 2:15 - loss: 0.7005 - regression_loss: 0.6287 - classification_loss: 0.0719 106/500 [=====>........................] - ETA: 2:15 - loss: 0.7024 - regression_loss: 0.6304 - classification_loss: 0.0720 107/500 [=====>........................] - ETA: 2:15 - loss: 0.7039 - regression_loss: 0.6316 - classification_loss: 0.0723 108/500 [=====>........................] - ETA: 2:14 - loss: 0.7106 - regression_loss: 0.6371 - classification_loss: 0.0736 109/500 [=====>........................] - ETA: 2:14 - loss: 0.7115 - regression_loss: 0.6377 - classification_loss: 0.0738 110/500 [=====>........................] - ETA: 2:14 - loss: 0.7088 - regression_loss: 0.6354 - classification_loss: 0.0735 111/500 [=====>........................] - ETA: 2:14 - loss: 0.7116 - regression_loss: 0.6379 - classification_loss: 0.0737 112/500 [=====>........................] - ETA: 2:13 - loss: 0.7105 - regression_loss: 0.6368 - classification_loss: 0.0737 113/500 [=====>........................] - ETA: 2:13 - loss: 0.7094 - regression_loss: 0.6358 - classification_loss: 0.0736 114/500 [=====>........................] - ETA: 2:12 - loss: 0.7127 - regression_loss: 0.6385 - classification_loss: 0.0741 115/500 [=====>........................] - ETA: 2:12 - loss: 0.7094 - regression_loss: 0.6357 - classification_loss: 0.0738 116/500 [=====>........................] 
- ETA: 2:12 - loss: 0.7086 - regression_loss: 0.6351 - classification_loss: 0.0735 117/500 [======>.......................] - ETA: 2:11 - loss: 0.7058 - regression_loss: 0.6326 - classification_loss: 0.0731 118/500 [======>.......................] - ETA: 2:11 - loss: 0.7047 - regression_loss: 0.6315 - classification_loss: 0.0731 119/500 [======>.......................] - ETA: 2:10 - loss: 0.7017 - regression_loss: 0.6289 - classification_loss: 0.0727 120/500 [======>.......................] - ETA: 2:10 - loss: 0.7028 - regression_loss: 0.6302 - classification_loss: 0.0726 121/500 [======>.......................] - ETA: 2:10 - loss: 0.7003 - regression_loss: 0.6280 - classification_loss: 0.0723 122/500 [======>.......................] - ETA: 2:10 - loss: 0.6992 - regression_loss: 0.6270 - classification_loss: 0.0722 123/500 [======>.......................] - ETA: 2:09 - loss: 0.6974 - regression_loss: 0.6255 - classification_loss: 0.0719 124/500 [======>.......................] - ETA: 2:09 - loss: 0.6993 - regression_loss: 0.6275 - classification_loss: 0.0718 125/500 [======>.......................] - ETA: 2:09 - loss: 0.7011 - regression_loss: 0.6291 - classification_loss: 0.0720 126/500 [======>.......................] - ETA: 2:08 - loss: 0.7020 - regression_loss: 0.6300 - classification_loss: 0.0721 127/500 [======>.......................] - ETA: 2:08 - loss: 0.7009 - regression_loss: 0.6289 - classification_loss: 0.0720 128/500 [======>.......................] - ETA: 2:08 - loss: 0.7030 - regression_loss: 0.6307 - classification_loss: 0.0723 129/500 [======>.......................] - ETA: 2:07 - loss: 0.7018 - regression_loss: 0.6297 - classification_loss: 0.0721 130/500 [======>.......................] - ETA: 2:07 - loss: 0.7028 - regression_loss: 0.6304 - classification_loss: 0.0724 131/500 [======>.......................] - ETA: 2:07 - loss: 0.7006 - regression_loss: 0.6284 - classification_loss: 0.0722 132/500 [======>.......................] 
- ETA: 2:06 - loss: 0.6998 - regression_loss: 0.6275 - classification_loss: 0.0723 133/500 [======>.......................] - ETA: 2:06 - loss: 0.6999 - regression_loss: 0.6277 - classification_loss: 0.0722 134/500 [=======>......................] - ETA: 2:05 - loss: 0.6983 - regression_loss: 0.6261 - classification_loss: 0.0722 135/500 [=======>......................] - ETA: 2:05 - loss: 0.6967 - regression_loss: 0.6246 - classification_loss: 0.0721 136/500 [=======>......................] - ETA: 2:05 - loss: 0.6944 - regression_loss: 0.6225 - classification_loss: 0.0719 137/500 [=======>......................] - ETA: 2:04 - loss: 0.6949 - regression_loss: 0.6229 - classification_loss: 0.0719 138/500 [=======>......................] - ETA: 2:04 - loss: 0.6965 - regression_loss: 0.6245 - classification_loss: 0.0720 139/500 [=======>......................] - ETA: 2:04 - loss: 0.6964 - regression_loss: 0.6244 - classification_loss: 0.0720 140/500 [=======>......................] - ETA: 2:03 - loss: 0.6965 - regression_loss: 0.6245 - classification_loss: 0.0720 141/500 [=======>......................] - ETA: 2:03 - loss: 0.6965 - regression_loss: 0.6246 - classification_loss: 0.0719 142/500 [=======>......................] - ETA: 2:03 - loss: 0.6973 - regression_loss: 0.6252 - classification_loss: 0.0721 143/500 [=======>......................] - ETA: 2:02 - loss: 0.6987 - regression_loss: 0.6265 - classification_loss: 0.0722 144/500 [=======>......................] - ETA: 2:02 - loss: 0.6986 - regression_loss: 0.6264 - classification_loss: 0.0722 145/500 [=======>......................] - ETA: 2:02 - loss: 0.7007 - regression_loss: 0.6285 - classification_loss: 0.0722 146/500 [=======>......................] - ETA: 2:01 - loss: 0.7045 - regression_loss: 0.6320 - classification_loss: 0.0725 147/500 [=======>......................] - ETA: 2:01 - loss: 0.7054 - regression_loss: 0.6330 - classification_loss: 0.0724 148/500 [=======>......................] 
- ETA: 2:01 - loss: 0.7050 - regression_loss: 0.6327 - classification_loss: 0.0723 149/500 [=======>......................] - ETA: 2:00 - loss: 0.7059 - regression_loss: 0.6336 - classification_loss: 0.0723 150/500 [========>.....................] - ETA: 2:00 - loss: 0.7050 - regression_loss: 0.6327 - classification_loss: 0.0723 151/500 [========>.....................] - ETA: 1:59 - loss: 0.7049 - regression_loss: 0.6327 - classification_loss: 0.0722 152/500 [========>.....................] - ETA: 1:59 - loss: 0.7031 - regression_loss: 0.6311 - classification_loss: 0.0720 153/500 [========>.....................] - ETA: 1:59 - loss: 0.7030 - regression_loss: 0.6309 - classification_loss: 0.0720 154/500 [========>.....................] - ETA: 1:58 - loss: 0.7035 - regression_loss: 0.6314 - classification_loss: 0.0721 155/500 [========>.....................] - ETA: 1:58 - loss: 0.7046 - regression_loss: 0.6326 - classification_loss: 0.0721 156/500 [========>.....................] - ETA: 1:58 - loss: 0.7051 - regression_loss: 0.6328 - classification_loss: 0.0723 157/500 [========>.....................] - ETA: 1:57 - loss: 0.7041 - regression_loss: 0.6320 - classification_loss: 0.0721 158/500 [========>.....................] - ETA: 1:57 - loss: 0.7020 - regression_loss: 0.6302 - classification_loss: 0.0719 159/500 [========>.....................] - ETA: 1:57 - loss: 0.7013 - regression_loss: 0.6296 - classification_loss: 0.0717 160/500 [========>.....................] - ETA: 1:56 - loss: 0.7018 - regression_loss: 0.6299 - classification_loss: 0.0718 161/500 [========>.....................] - ETA: 1:56 - loss: 0.7023 - regression_loss: 0.6305 - classification_loss: 0.0718 162/500 [========>.....................] - ETA: 1:56 - loss: 0.7018 - regression_loss: 0.6301 - classification_loss: 0.0717 163/500 [========>.....................] - ETA: 1:55 - loss: 0.7010 - regression_loss: 0.6295 - classification_loss: 0.0715 164/500 [========>.....................] 
- ETA: 1:55 - loss: 0.6996 - regression_loss: 0.6284 - classification_loss: 0.0712 165/500 [========>.....................] - ETA: 1:55 - loss: 0.7013 - regression_loss: 0.6299 - classification_loss: 0.0714 166/500 [========>.....................] - ETA: 1:54 - loss: 0.7048 - regression_loss: 0.6327 - classification_loss: 0.0721 167/500 [=========>....................] - ETA: 1:54 - loss: 0.7032 - regression_loss: 0.6313 - classification_loss: 0.0719 168/500 [=========>....................] - ETA: 1:53 - loss: 0.7019 - regression_loss: 0.6302 - classification_loss: 0.0716 169/500 [=========>....................] - ETA: 1:53 - loss: 0.7003 - regression_loss: 0.6289 - classification_loss: 0.0714 170/500 [=========>....................] - ETA: 1:53 - loss: 0.6986 - regression_loss: 0.6273 - classification_loss: 0.0713 171/500 [=========>....................] - ETA: 1:52 - loss: 0.7011 - regression_loss: 0.6294 - classification_loss: 0.0717 172/500 [=========>....................] - ETA: 1:52 - loss: 0.6991 - regression_loss: 0.6277 - classification_loss: 0.0714 173/500 [=========>....................] - ETA: 1:52 - loss: 0.7012 - regression_loss: 0.6296 - classification_loss: 0.0716 174/500 [=========>....................] - ETA: 1:51 - loss: 0.7014 - regression_loss: 0.6297 - classification_loss: 0.0716 175/500 [=========>....................] - ETA: 1:51 - loss: 0.7024 - regression_loss: 0.6308 - classification_loss: 0.0716 176/500 [=========>....................] - ETA: 1:51 - loss: 0.7021 - regression_loss: 0.6305 - classification_loss: 0.0716 177/500 [=========>....................] - ETA: 1:50 - loss: 0.7019 - regression_loss: 0.6304 - classification_loss: 0.0715 178/500 [=========>....................] - ETA: 1:50 - loss: 0.7009 - regression_loss: 0.6295 - classification_loss: 0.0714 179/500 [=========>....................] - ETA: 1:50 - loss: 0.7011 - regression_loss: 0.6298 - classification_loss: 0.0713 180/500 [=========>....................] 
- ETA: 1:49 - loss: 0.7020 - regression_loss: 0.6304 - classification_loss: 0.0716 181/500 [=========>....................] - ETA: 1:49 - loss: 0.7044 - regression_loss: 0.6324 - classification_loss: 0.0720 182/500 [=========>....................] - ETA: 1:49 - loss: 0.7050 - regression_loss: 0.6329 - classification_loss: 0.0721 183/500 [=========>....................] - ETA: 1:48 - loss: 0.7068 - regression_loss: 0.6346 - classification_loss: 0.0722 184/500 [==========>...................] - ETA: 1:48 - loss: 0.7051 - regression_loss: 0.6331 - classification_loss: 0.0721 185/500 [==========>...................] - ETA: 1:48 - loss: 0.7037 - regression_loss: 0.6318 - classification_loss: 0.0719 186/500 [==========>...................] - ETA: 1:47 - loss: 0.7032 - regression_loss: 0.6315 - classification_loss: 0.0718 187/500 [==========>...................] - ETA: 1:47 - loss: 0.7016 - regression_loss: 0.6301 - classification_loss: 0.0715 188/500 [==========>...................] - ETA: 1:47 - loss: 0.6996 - regression_loss: 0.6283 - classification_loss: 0.0713 189/500 [==========>...................] - ETA: 1:46 - loss: 0.6991 - regression_loss: 0.6279 - classification_loss: 0.0713 190/500 [==========>...................] - ETA: 1:46 - loss: 0.6964 - regression_loss: 0.6254 - classification_loss: 0.0710 191/500 [==========>...................] - ETA: 1:46 - loss: 0.6955 - regression_loss: 0.6245 - classification_loss: 0.0710 192/500 [==========>...................] - ETA: 1:45 - loss: 0.6935 - regression_loss: 0.6228 - classification_loss: 0.0707 193/500 [==========>...................] - ETA: 1:45 - loss: 0.6944 - regression_loss: 0.6235 - classification_loss: 0.0709 194/500 [==========>...................] - ETA: 1:45 - loss: 0.6938 - regression_loss: 0.6230 - classification_loss: 0.0708 195/500 [==========>...................] - ETA: 1:44 - loss: 0.6929 - regression_loss: 0.6221 - classification_loss: 0.0708 196/500 [==========>...................] 
- ETA: 1:44 - loss: 0.6933 - regression_loss: 0.6226 - classification_loss: 0.0707 197/500 [==========>...................] - ETA: 1:44 - loss: 0.6939 - regression_loss: 0.6234 - classification_loss: 0.0706 198/500 [==========>...................] - ETA: 1:43 - loss: 0.6944 - regression_loss: 0.6239 - classification_loss: 0.0706 199/500 [==========>...................] - ETA: 1:43 - loss: 0.6931 - regression_loss: 0.6228 - classification_loss: 0.0704 200/500 [===========>..................] - ETA: 1:43 - loss: 0.6933 - regression_loss: 0.6230 - classification_loss: 0.0704 201/500 [===========>..................] - ETA: 1:42 - loss: 0.6911 - regression_loss: 0.6210 - classification_loss: 0.0701 202/500 [===========>..................] - ETA: 1:42 - loss: 0.6909 - regression_loss: 0.6208 - classification_loss: 0.0701 203/500 [===========>..................] - ETA: 1:42 - loss: 0.6903 - regression_loss: 0.6203 - classification_loss: 0.0700 204/500 [===========>..................] - ETA: 1:41 - loss: 0.6881 - regression_loss: 0.6184 - classification_loss: 0.0698 205/500 [===========>..................] - ETA: 1:41 - loss: 0.6868 - regression_loss: 0.6172 - classification_loss: 0.0696 206/500 [===========>..................] - ETA: 1:41 - loss: 0.6866 - regression_loss: 0.6171 - classification_loss: 0.0695 207/500 [===========>..................] - ETA: 1:40 - loss: 0.6867 - regression_loss: 0.6173 - classification_loss: 0.0694 208/500 [===========>..................] - ETA: 1:40 - loss: 0.6871 - regression_loss: 0.6174 - classification_loss: 0.0696 209/500 [===========>..................] - ETA: 1:39 - loss: 0.6862 - regression_loss: 0.6166 - classification_loss: 0.0696 210/500 [===========>..................] - ETA: 1:39 - loss: 0.6860 - regression_loss: 0.6165 - classification_loss: 0.0695 211/500 [===========>..................] - ETA: 1:39 - loss: 0.6843 - regression_loss: 0.6150 - classification_loss: 0.0693 212/500 [===========>..................] 
- ETA: 1:38 - loss: 0.6835 - regression_loss: 0.6144 - classification_loss: 0.0691 213/500 [===========>..................] - ETA: 1:38 - loss: 0.6811 - regression_loss: 0.6122 - classification_loss: 0.0689 214/500 [===========>..................] - ETA: 1:38 - loss: 0.6806 - regression_loss: 0.6118 - classification_loss: 0.0688 215/500 [===========>..................] - ETA: 1:37 - loss: 0.6819 - regression_loss: 0.6127 - classification_loss: 0.0691 216/500 [===========>..................] - ETA: 1:37 - loss: 0.6829 - regression_loss: 0.6135 - classification_loss: 0.0694 217/500 [============>.................] - ETA: 1:37 - loss: 0.6825 - regression_loss: 0.6132 - classification_loss: 0.0694 218/500 [============>.................] - ETA: 1:36 - loss: 0.6830 - regression_loss: 0.6137 - classification_loss: 0.0693 219/500 [============>.................] - ETA: 1:36 - loss: 0.6826 - regression_loss: 0.6134 - classification_loss: 0.0693 220/500 [============>.................] - ETA: 1:36 - loss: 0.6820 - regression_loss: 0.6127 - classification_loss: 0.0692 221/500 [============>.................] - ETA: 1:35 - loss: 0.6809 - regression_loss: 0.6119 - classification_loss: 0.0690 222/500 [============>.................] - ETA: 1:35 - loss: 0.6809 - regression_loss: 0.6119 - classification_loss: 0.0690 223/500 [============>.................] - ETA: 1:35 - loss: 0.6820 - regression_loss: 0.6128 - classification_loss: 0.0692 224/500 [============>.................] - ETA: 1:34 - loss: 0.6820 - regression_loss: 0.6129 - classification_loss: 0.0691 225/500 [============>.................] - ETA: 1:34 - loss: 0.6823 - regression_loss: 0.6133 - classification_loss: 0.0690 226/500 [============>.................] - ETA: 1:34 - loss: 0.6810 - regression_loss: 0.6122 - classification_loss: 0.0688 227/500 [============>.................] - ETA: 1:33 - loss: 0.6797 - regression_loss: 0.6110 - classification_loss: 0.0687 228/500 [============>.................] 
- ETA: 1:33 - loss: 0.6797 - regression_loss: 0.6111 - classification_loss: 0.0687 229/500 [============>.................] - ETA: 1:32 - loss: 0.6790 - regression_loss: 0.6104 - classification_loss: 0.0686 230/500 [============>.................] - ETA: 1:32 - loss: 0.6806 - regression_loss: 0.6117 - classification_loss: 0.0689 231/500 [============>.................] - ETA: 1:32 - loss: 0.6807 - regression_loss: 0.6119 - classification_loss: 0.0688 232/500 [============>.................] - ETA: 1:31 - loss: 0.6788 - regression_loss: 0.6103 - classification_loss: 0.0686 233/500 [============>.................] - ETA: 1:31 - loss: 0.6800 - regression_loss: 0.6114 - classification_loss: 0.0686 234/500 [=============>................] - ETA: 1:31 - loss: 0.6808 - regression_loss: 0.6123 - classification_loss: 0.0685 235/500 [=============>................] - ETA: 1:30 - loss: 0.6810 - regression_loss: 0.6125 - classification_loss: 0.0685 236/500 [=============>................] - ETA: 1:30 - loss: 0.6808 - regression_loss: 0.6123 - classification_loss: 0.0685 237/500 [=============>................] - ETA: 1:30 - loss: 0.6820 - regression_loss: 0.6133 - classification_loss: 0.0687 238/500 [=============>................] - ETA: 1:29 - loss: 0.6823 - regression_loss: 0.6135 - classification_loss: 0.0688 239/500 [=============>................] - ETA: 1:29 - loss: 0.6811 - regression_loss: 0.6125 - classification_loss: 0.0687 240/500 [=============>................] - ETA: 1:29 - loss: 0.6810 - regression_loss: 0.6124 - classification_loss: 0.0686 241/500 [=============>................] - ETA: 1:28 - loss: 0.6835 - regression_loss: 0.6147 - classification_loss: 0.0688 242/500 [=============>................] - ETA: 1:28 - loss: 0.6844 - regression_loss: 0.6156 - classification_loss: 0.0688 243/500 [=============>................] - ETA: 1:28 - loss: 0.6843 - regression_loss: 0.6154 - classification_loss: 0.0689 244/500 [=============>................] 
- ETA: 1:27 - loss: 0.6826 - regression_loss: 0.6139 - classification_loss: 0.0687 245/500 [=============>................] - ETA: 1:27 - loss: 0.6820 - regression_loss: 0.6134 - classification_loss: 0.0686 246/500 [=============>................] - ETA: 1:27 - loss: 0.6805 - regression_loss: 0.6120 - classification_loss: 0.0685 247/500 [=============>................] - ETA: 1:26 - loss: 0.6819 - regression_loss: 0.6134 - classification_loss: 0.0685 248/500 [=============>................] - ETA: 1:26 - loss: 0.6830 - regression_loss: 0.6144 - classification_loss: 0.0686 249/500 [=============>................] - ETA: 1:26 - loss: 0.6819 - regression_loss: 0.6134 - classification_loss: 0.0685 250/500 [==============>...............] - ETA: 1:25 - loss: 0.6832 - regression_loss: 0.6146 - classification_loss: 0.0686 251/500 [==============>...............] - ETA: 1:25 - loss: 0.6820 - regression_loss: 0.6135 - classification_loss: 0.0685 252/500 [==============>...............] - ETA: 1:25 - loss: 0.6815 - regression_loss: 0.6131 - classification_loss: 0.0685 253/500 [==============>...............] - ETA: 1:24 - loss: 0.6832 - regression_loss: 0.6146 - classification_loss: 0.0687 254/500 [==============>...............] - ETA: 1:24 - loss: 0.6832 - regression_loss: 0.6146 - classification_loss: 0.0686 255/500 [==============>...............] - ETA: 1:24 - loss: 0.6823 - regression_loss: 0.6138 - classification_loss: 0.0685 256/500 [==============>...............] - ETA: 1:23 - loss: 0.6822 - regression_loss: 0.6137 - classification_loss: 0.0685 257/500 [==============>...............] - ETA: 1:23 - loss: 0.6821 - regression_loss: 0.6135 - classification_loss: 0.0686 258/500 [==============>...............] - ETA: 1:23 - loss: 0.6832 - regression_loss: 0.6145 - classification_loss: 0.0687 259/500 [==============>...............] - ETA: 1:22 - loss: 0.6852 - regression_loss: 0.6163 - classification_loss: 0.0689 260/500 [==============>...............] 
[Epoch 15, steps 261-499: per-step progress-bar output condensed. Loss drifted from ~0.686 down to ~0.667 (regression_loss ~0.617 -> ~0.600, classification_loss ~0.069 -> ~0.067).]
500/500 [==============================] - 172s 344ms/step - loss: 0.6665 - regression_loss: 0.5992 - classification_loss: 0.0673
1172 instances of class plum with average precision: 0.7310
mAP: 0.7310
Epoch 00015: saving model to ./training/snapshots/resnet101_pascal_15.h5
Epoch 16/150
[Epoch 16, steps 1-14: per-step progress-bar output condensed. Loss settled from 0.8786 at step 1 to ~0.69.]
[Epoch 16, steps 15-94: per-step progress-bar output condensed. Loss fluctuated between ~0.645 and ~0.696 (classification_loss ~0.058-0.067).]
- ETA: 2:19 - loss: 0.6686 - regression_loss: 0.6023 - classification_loss: 0.0663 95/500 [====>.........................] - ETA: 2:19 - loss: 0.6679 - regression_loss: 0.6020 - classification_loss: 0.0659 96/500 [====>.........................] - ETA: 2:18 - loss: 0.6666 - regression_loss: 0.6008 - classification_loss: 0.0657 97/500 [====>.........................] - ETA: 2:18 - loss: 0.6640 - regression_loss: 0.5985 - classification_loss: 0.0655 98/500 [====>.........................] - ETA: 2:18 - loss: 0.6624 - regression_loss: 0.5972 - classification_loss: 0.0652 99/500 [====>.........................] - ETA: 2:17 - loss: 0.6599 - regression_loss: 0.5950 - classification_loss: 0.0649 100/500 [=====>........................] - ETA: 2:17 - loss: 0.6594 - regression_loss: 0.5945 - classification_loss: 0.0649 101/500 [=====>........................] - ETA: 2:17 - loss: 0.6604 - regression_loss: 0.5952 - classification_loss: 0.0652 102/500 [=====>........................] - ETA: 2:16 - loss: 0.6604 - regression_loss: 0.5952 - classification_loss: 0.0652 103/500 [=====>........................] - ETA: 2:16 - loss: 0.6590 - regression_loss: 0.5938 - classification_loss: 0.0652 104/500 [=====>........................] - ETA: 2:16 - loss: 0.6607 - regression_loss: 0.5951 - classification_loss: 0.0656 105/500 [=====>........................] - ETA: 2:15 - loss: 0.6623 - regression_loss: 0.5964 - classification_loss: 0.0658 106/500 [=====>........................] - ETA: 2:15 - loss: 0.6652 - regression_loss: 0.5992 - classification_loss: 0.0660 107/500 [=====>........................] - ETA: 2:15 - loss: 0.6653 - regression_loss: 0.5995 - classification_loss: 0.0657 108/500 [=====>........................] - ETA: 2:14 - loss: 0.6677 - regression_loss: 0.6016 - classification_loss: 0.0661 109/500 [=====>........................] - ETA: 2:14 - loss: 0.6682 - regression_loss: 0.6021 - classification_loss: 0.0661 110/500 [=====>........................] 
- ETA: 2:14 - loss: 0.6676 - regression_loss: 0.6017 - classification_loss: 0.0659 111/500 [=====>........................] - ETA: 2:13 - loss: 0.6672 - regression_loss: 0.6011 - classification_loss: 0.0660 112/500 [=====>........................] - ETA: 2:13 - loss: 0.6650 - regression_loss: 0.5992 - classification_loss: 0.0659 113/500 [=====>........................] - ETA: 2:13 - loss: 0.6620 - regression_loss: 0.5964 - classification_loss: 0.0655 114/500 [=====>........................] - ETA: 2:12 - loss: 0.6623 - regression_loss: 0.5967 - classification_loss: 0.0657 115/500 [=====>........................] - ETA: 2:12 - loss: 0.6630 - regression_loss: 0.5970 - classification_loss: 0.0659 116/500 [=====>........................] - ETA: 2:12 - loss: 0.6678 - regression_loss: 0.6012 - classification_loss: 0.0666 117/500 [======>.......................] - ETA: 2:11 - loss: 0.6672 - regression_loss: 0.6007 - classification_loss: 0.0665 118/500 [======>.......................] - ETA: 2:11 - loss: 0.6650 - regression_loss: 0.5988 - classification_loss: 0.0662 119/500 [======>.......................] - ETA: 2:11 - loss: 0.6657 - regression_loss: 0.5994 - classification_loss: 0.0663 120/500 [======>.......................] - ETA: 2:10 - loss: 0.6654 - regression_loss: 0.5990 - classification_loss: 0.0664 121/500 [======>.......................] - ETA: 2:10 - loss: 0.6653 - regression_loss: 0.5987 - classification_loss: 0.0666 122/500 [======>.......................] - ETA: 2:09 - loss: 0.6628 - regression_loss: 0.5963 - classification_loss: 0.0665 123/500 [======>.......................] - ETA: 2:09 - loss: 0.6630 - regression_loss: 0.5965 - classification_loss: 0.0665 124/500 [======>.......................] - ETA: 2:09 - loss: 0.6609 - regression_loss: 0.5945 - classification_loss: 0.0663 125/500 [======>.......................] - ETA: 2:08 - loss: 0.6576 - regression_loss: 0.5915 - classification_loss: 0.0661 126/500 [======>.......................] 
- ETA: 2:08 - loss: 0.6564 - regression_loss: 0.5905 - classification_loss: 0.0659 127/500 [======>.......................] - ETA: 2:08 - loss: 0.6563 - regression_loss: 0.5905 - classification_loss: 0.0658 128/500 [======>.......................] - ETA: 2:07 - loss: 0.6539 - regression_loss: 0.5883 - classification_loss: 0.0656 129/500 [======>.......................] - ETA: 2:07 - loss: 0.6541 - regression_loss: 0.5887 - classification_loss: 0.0654 130/500 [======>.......................] - ETA: 2:06 - loss: 0.6536 - regression_loss: 0.5883 - classification_loss: 0.0652 131/500 [======>.......................] - ETA: 2:06 - loss: 0.6530 - regression_loss: 0.5876 - classification_loss: 0.0654 132/500 [======>.......................] - ETA: 2:06 - loss: 0.6525 - regression_loss: 0.5873 - classification_loss: 0.0652 133/500 [======>.......................] - ETA: 2:05 - loss: 0.6525 - regression_loss: 0.5873 - classification_loss: 0.0652 134/500 [=======>......................] - ETA: 2:05 - loss: 0.6510 - regression_loss: 0.5860 - classification_loss: 0.0651 135/500 [=======>......................] - ETA: 2:05 - loss: 0.6512 - regression_loss: 0.5861 - classification_loss: 0.0651 136/500 [=======>......................] - ETA: 2:04 - loss: 0.6518 - regression_loss: 0.5867 - classification_loss: 0.0652 137/500 [=======>......................] - ETA: 2:04 - loss: 0.6532 - regression_loss: 0.5875 - classification_loss: 0.0657 138/500 [=======>......................] - ETA: 2:04 - loss: 0.6522 - regression_loss: 0.5866 - classification_loss: 0.0656 139/500 [=======>......................] - ETA: 2:03 - loss: 0.6496 - regression_loss: 0.5844 - classification_loss: 0.0653 140/500 [=======>......................] - ETA: 2:03 - loss: 0.6491 - regression_loss: 0.5838 - classification_loss: 0.0652 141/500 [=======>......................] - ETA: 2:03 - loss: 0.6483 - regression_loss: 0.5830 - classification_loss: 0.0653 142/500 [=======>......................] 
- ETA: 2:02 - loss: 0.6478 - regression_loss: 0.5826 - classification_loss: 0.0652 143/500 [=======>......................] - ETA: 2:02 - loss: 0.6476 - regression_loss: 0.5826 - classification_loss: 0.0651 144/500 [=======>......................] - ETA: 2:02 - loss: 0.6460 - regression_loss: 0.5811 - classification_loss: 0.0649 145/500 [=======>......................] - ETA: 2:01 - loss: 0.6447 - regression_loss: 0.5798 - classification_loss: 0.0648 146/500 [=======>......................] - ETA: 2:01 - loss: 0.6433 - regression_loss: 0.5786 - classification_loss: 0.0647 147/500 [=======>......................] - ETA: 2:01 - loss: 0.6434 - regression_loss: 0.5788 - classification_loss: 0.0645 148/500 [=======>......................] - ETA: 2:00 - loss: 0.6462 - regression_loss: 0.5816 - classification_loss: 0.0645 149/500 [=======>......................] - ETA: 2:00 - loss: 0.6440 - regression_loss: 0.5796 - classification_loss: 0.0645 150/500 [========>.....................] - ETA: 2:00 - loss: 0.6428 - regression_loss: 0.5786 - classification_loss: 0.0642 151/500 [========>.....................] - ETA: 1:59 - loss: 0.6414 - regression_loss: 0.5773 - classification_loss: 0.0641 152/500 [========>.....................] - ETA: 1:59 - loss: 0.6410 - regression_loss: 0.5770 - classification_loss: 0.0640 153/500 [========>.....................] - ETA: 1:59 - loss: 0.6394 - regression_loss: 0.5756 - classification_loss: 0.0638 154/500 [========>.....................] - ETA: 1:58 - loss: 0.6369 - regression_loss: 0.5735 - classification_loss: 0.0635 155/500 [========>.....................] - ETA: 1:58 - loss: 0.6355 - regression_loss: 0.5721 - classification_loss: 0.0634 156/500 [========>.....................] - ETA: 1:58 - loss: 0.6340 - regression_loss: 0.5708 - classification_loss: 0.0633 157/500 [========>.....................] - ETA: 1:57 - loss: 0.6349 - regression_loss: 0.5716 - classification_loss: 0.0633 158/500 [========>.....................] 
- ETA: 1:57 - loss: 0.6343 - regression_loss: 0.5710 - classification_loss: 0.0632 159/500 [========>.....................] - ETA: 1:57 - loss: 0.6335 - regression_loss: 0.5702 - classification_loss: 0.0633 160/500 [========>.....................] - ETA: 1:56 - loss: 0.6349 - regression_loss: 0.5716 - classification_loss: 0.0633 161/500 [========>.....................] - ETA: 1:56 - loss: 0.6347 - regression_loss: 0.5714 - classification_loss: 0.0632 162/500 [========>.....................] - ETA: 1:55 - loss: 0.6346 - regression_loss: 0.5712 - classification_loss: 0.0634 163/500 [========>.....................] - ETA: 1:55 - loss: 0.6347 - regression_loss: 0.5712 - classification_loss: 0.0635 164/500 [========>.....................] - ETA: 1:55 - loss: 0.6329 - regression_loss: 0.5696 - classification_loss: 0.0634 165/500 [========>.....................] - ETA: 1:54 - loss: 0.6328 - regression_loss: 0.5693 - classification_loss: 0.0634 166/500 [========>.....................] - ETA: 1:54 - loss: 0.6321 - regression_loss: 0.5687 - classification_loss: 0.0634 167/500 [=========>....................] - ETA: 1:54 - loss: 0.6334 - regression_loss: 0.5700 - classification_loss: 0.0634 168/500 [=========>....................] - ETA: 1:53 - loss: 0.6329 - regression_loss: 0.5696 - classification_loss: 0.0633 169/500 [=========>....................] - ETA: 1:53 - loss: 0.6341 - regression_loss: 0.5708 - classification_loss: 0.0634 170/500 [=========>....................] - ETA: 1:53 - loss: 0.6346 - regression_loss: 0.5713 - classification_loss: 0.0633 171/500 [=========>....................] - ETA: 1:52 - loss: 0.6333 - regression_loss: 0.5701 - classification_loss: 0.0631 172/500 [=========>....................] - ETA: 1:52 - loss: 0.6340 - regression_loss: 0.5707 - classification_loss: 0.0633 173/500 [=========>....................] - ETA: 1:52 - loss: 0.6331 - regression_loss: 0.5698 - classification_loss: 0.0633 174/500 [=========>....................] 
- ETA: 1:51 - loss: 0.6349 - regression_loss: 0.5714 - classification_loss: 0.0635 175/500 [=========>....................] - ETA: 1:51 - loss: 0.6349 - regression_loss: 0.5715 - classification_loss: 0.0634 176/500 [=========>....................] - ETA: 1:51 - loss: 0.6352 - regression_loss: 0.5718 - classification_loss: 0.0633 177/500 [=========>....................] - ETA: 1:50 - loss: 0.6334 - regression_loss: 0.5703 - classification_loss: 0.0631 178/500 [=========>....................] - ETA: 1:50 - loss: 0.6357 - regression_loss: 0.5722 - classification_loss: 0.0636 179/500 [=========>....................] - ETA: 1:50 - loss: 0.6369 - regression_loss: 0.5732 - classification_loss: 0.0637 180/500 [=========>....................] - ETA: 1:49 - loss: 0.6363 - regression_loss: 0.5726 - classification_loss: 0.0637 181/500 [=========>....................] - ETA: 1:49 - loss: 0.6381 - regression_loss: 0.5741 - classification_loss: 0.0640 182/500 [=========>....................] - ETA: 1:49 - loss: 0.6382 - regression_loss: 0.5742 - classification_loss: 0.0640 183/500 [=========>....................] - ETA: 1:48 - loss: 0.6410 - regression_loss: 0.5763 - classification_loss: 0.0648 184/500 [==========>...................] - ETA: 1:48 - loss: 0.6394 - regression_loss: 0.5749 - classification_loss: 0.0645 185/500 [==========>...................] - ETA: 1:48 - loss: 0.6398 - regression_loss: 0.5752 - classification_loss: 0.0646 186/500 [==========>...................] - ETA: 1:47 - loss: 0.6417 - regression_loss: 0.5770 - classification_loss: 0.0646 187/500 [==========>...................] - ETA: 1:47 - loss: 0.6406 - regression_loss: 0.5761 - classification_loss: 0.0645 188/500 [==========>...................] - ETA: 1:47 - loss: 0.6410 - regression_loss: 0.5765 - classification_loss: 0.0645 189/500 [==========>...................] - ETA: 1:46 - loss: 0.6439 - regression_loss: 0.5793 - classification_loss: 0.0646 190/500 [==========>...................] 
- ETA: 1:46 - loss: 0.6446 - regression_loss: 0.5803 - classification_loss: 0.0644 191/500 [==========>...................] - ETA: 1:46 - loss: 0.6437 - regression_loss: 0.5794 - classification_loss: 0.0643 192/500 [==========>...................] - ETA: 1:45 - loss: 0.6435 - regression_loss: 0.5792 - classification_loss: 0.0643 193/500 [==========>...................] - ETA: 1:45 - loss: 0.6429 - regression_loss: 0.5787 - classification_loss: 0.0642 194/500 [==========>...................] - ETA: 1:45 - loss: 0.6420 - regression_loss: 0.5780 - classification_loss: 0.0640 195/500 [==========>...................] - ETA: 1:44 - loss: 0.6414 - regression_loss: 0.5775 - classification_loss: 0.0639 196/500 [==========>...................] - ETA: 1:44 - loss: 0.6414 - regression_loss: 0.5775 - classification_loss: 0.0639 197/500 [==========>...................] - ETA: 1:44 - loss: 0.6425 - regression_loss: 0.5786 - classification_loss: 0.0639 198/500 [==========>...................] - ETA: 1:43 - loss: 0.6448 - regression_loss: 0.5806 - classification_loss: 0.0642 199/500 [==========>...................] - ETA: 1:43 - loss: 0.6437 - regression_loss: 0.5796 - classification_loss: 0.0641 200/500 [===========>..................] - ETA: 1:43 - loss: 0.6441 - regression_loss: 0.5799 - classification_loss: 0.0642 201/500 [===========>..................] - ETA: 1:42 - loss: 0.6449 - regression_loss: 0.5806 - classification_loss: 0.0643 202/500 [===========>..................] - ETA: 1:42 - loss: 0.6442 - regression_loss: 0.5800 - classification_loss: 0.0642 203/500 [===========>..................] - ETA: 1:42 - loss: 0.6451 - regression_loss: 0.5808 - classification_loss: 0.0644 204/500 [===========>..................] - ETA: 1:41 - loss: 0.6447 - regression_loss: 0.5804 - classification_loss: 0.0644 205/500 [===========>..................] - ETA: 1:41 - loss: 0.6446 - regression_loss: 0.5802 - classification_loss: 0.0644 206/500 [===========>..................] 
- ETA: 1:41 - loss: 0.6428 - regression_loss: 0.5786 - classification_loss: 0.0642 207/500 [===========>..................] - ETA: 1:40 - loss: 0.6405 - regression_loss: 0.5765 - classification_loss: 0.0640 208/500 [===========>..................] - ETA: 1:40 - loss: 0.6405 - regression_loss: 0.5765 - classification_loss: 0.0639 209/500 [===========>..................] - ETA: 1:40 - loss: 0.6396 - regression_loss: 0.5759 - classification_loss: 0.0638 210/500 [===========>..................] - ETA: 1:39 - loss: 0.6418 - regression_loss: 0.5776 - classification_loss: 0.0642 211/500 [===========>..................] - ETA: 1:39 - loss: 0.6431 - regression_loss: 0.5785 - classification_loss: 0.0645 212/500 [===========>..................] - ETA: 1:39 - loss: 0.6448 - regression_loss: 0.5802 - classification_loss: 0.0646 213/500 [===========>..................] - ETA: 1:38 - loss: 0.6441 - regression_loss: 0.5795 - classification_loss: 0.0645 214/500 [===========>..................] - ETA: 1:38 - loss: 0.6434 - regression_loss: 0.5789 - classification_loss: 0.0644 215/500 [===========>..................] - ETA: 1:38 - loss: 0.6451 - regression_loss: 0.5804 - classification_loss: 0.0648 216/500 [===========>..................] - ETA: 1:37 - loss: 0.6444 - regression_loss: 0.5798 - classification_loss: 0.0646 217/500 [============>.................] - ETA: 1:37 - loss: 0.6433 - regression_loss: 0.5789 - classification_loss: 0.0644 218/500 [============>.................] - ETA: 1:37 - loss: 0.6445 - regression_loss: 0.5802 - classification_loss: 0.0643 219/500 [============>.................] - ETA: 1:36 - loss: 0.6451 - regression_loss: 0.5808 - classification_loss: 0.0643 220/500 [============>.................] - ETA: 1:36 - loss: 0.6442 - regression_loss: 0.5799 - classification_loss: 0.0643 221/500 [============>.................] - ETA: 1:36 - loss: 0.6438 - regression_loss: 0.5795 - classification_loss: 0.0643 222/500 [============>.................] 
- ETA: 1:35 - loss: 0.6445 - regression_loss: 0.5801 - classification_loss: 0.0644 223/500 [============>.................] - ETA: 1:35 - loss: 0.6437 - regression_loss: 0.5794 - classification_loss: 0.0643 224/500 [============>.................] - ETA: 1:35 - loss: 0.6427 - regression_loss: 0.5784 - classification_loss: 0.0642 225/500 [============>.................] - ETA: 1:34 - loss: 0.6417 - regression_loss: 0.5776 - classification_loss: 0.0641 226/500 [============>.................] - ETA: 1:34 - loss: 0.6407 - regression_loss: 0.5767 - classification_loss: 0.0640 227/500 [============>.................] - ETA: 1:34 - loss: 0.6404 - regression_loss: 0.5764 - classification_loss: 0.0639 228/500 [============>.................] - ETA: 1:33 - loss: 0.6397 - regression_loss: 0.5760 - classification_loss: 0.0638 229/500 [============>.................] - ETA: 1:33 - loss: 0.6388 - regression_loss: 0.5751 - classification_loss: 0.0637 230/500 [============>.................] - ETA: 1:32 - loss: 0.6381 - regression_loss: 0.5744 - classification_loss: 0.0637 231/500 [============>.................] - ETA: 1:32 - loss: 0.6392 - regression_loss: 0.5754 - classification_loss: 0.0638 232/500 [============>.................] - ETA: 1:32 - loss: 0.6387 - regression_loss: 0.5750 - classification_loss: 0.0637 233/500 [============>.................] - ETA: 1:31 - loss: 0.6375 - regression_loss: 0.5739 - classification_loss: 0.0637 234/500 [=============>................] - ETA: 1:31 - loss: 0.6360 - regression_loss: 0.5724 - classification_loss: 0.0635 235/500 [=============>................] - ETA: 1:31 - loss: 0.6374 - regression_loss: 0.5736 - classification_loss: 0.0638 236/500 [=============>................] - ETA: 1:30 - loss: 0.6356 - regression_loss: 0.5720 - classification_loss: 0.0636 237/500 [=============>................] - ETA: 1:30 - loss: 0.6336 - regression_loss: 0.5702 - classification_loss: 0.0634 238/500 [=============>................] 
- ETA: 1:30 - loss: 0.6331 - regression_loss: 0.5697 - classification_loss: 0.0633 239/500 [=============>................] - ETA: 1:29 - loss: 0.6332 - regression_loss: 0.5698 - classification_loss: 0.0634 240/500 [=============>................] - ETA: 1:29 - loss: 0.6326 - regression_loss: 0.5693 - classification_loss: 0.0633 241/500 [=============>................] - ETA: 1:29 - loss: 0.6322 - regression_loss: 0.5690 - classification_loss: 0.0632 242/500 [=============>................] - ETA: 1:28 - loss: 0.6314 - regression_loss: 0.5683 - classification_loss: 0.0631 243/500 [=============>................] - ETA: 1:28 - loss: 0.6321 - regression_loss: 0.5689 - classification_loss: 0.0632 244/500 [=============>................] - ETA: 1:28 - loss: 0.6338 - regression_loss: 0.5703 - classification_loss: 0.0635 245/500 [=============>................] - ETA: 1:27 - loss: 0.6332 - regression_loss: 0.5698 - classification_loss: 0.0634 246/500 [=============>................] - ETA: 1:27 - loss: 0.6328 - regression_loss: 0.5694 - classification_loss: 0.0634 247/500 [=============>................] - ETA: 1:27 - loss: 0.6321 - regression_loss: 0.5689 - classification_loss: 0.0633 248/500 [=============>................] - ETA: 1:26 - loss: 0.6322 - regression_loss: 0.5690 - classification_loss: 0.0632 249/500 [=============>................] - ETA: 1:26 - loss: 0.6313 - regression_loss: 0.5682 - classification_loss: 0.0631 250/500 [==============>...............] - ETA: 1:26 - loss: 0.6335 - regression_loss: 0.5701 - classification_loss: 0.0634 251/500 [==============>...............] - ETA: 1:25 - loss: 0.6347 - regression_loss: 0.5711 - classification_loss: 0.0635 252/500 [==============>...............] - ETA: 1:25 - loss: 0.6336 - regression_loss: 0.5703 - classification_loss: 0.0633 253/500 [==============>...............] - ETA: 1:25 - loss: 0.6326 - regression_loss: 0.5694 - classification_loss: 0.0632 254/500 [==============>...............] 
- ETA: 1:24 - loss: 0.6328 - regression_loss: 0.5695 - classification_loss: 0.0633 255/500 [==============>...............] - ETA: 1:24 - loss: 0.6317 - regression_loss: 0.5685 - classification_loss: 0.0632 256/500 [==============>...............] - ETA: 1:23 - loss: 0.6323 - regression_loss: 0.5691 - classification_loss: 0.0632 257/500 [==============>...............] - ETA: 1:23 - loss: 0.6322 - regression_loss: 0.5691 - classification_loss: 0.0632 258/500 [==============>...............] - ETA: 1:23 - loss: 0.6308 - regression_loss: 0.5678 - classification_loss: 0.0630 259/500 [==============>...............] - ETA: 1:22 - loss: 0.6310 - regression_loss: 0.5679 - classification_loss: 0.0631 260/500 [==============>...............] - ETA: 1:22 - loss: 0.6308 - regression_loss: 0.5677 - classification_loss: 0.0631 261/500 [==============>...............] - ETA: 1:22 - loss: 0.6297 - regression_loss: 0.5667 - classification_loss: 0.0630 262/500 [==============>...............] - ETA: 1:21 - loss: 0.6293 - regression_loss: 0.5663 - classification_loss: 0.0630 263/500 [==============>...............] - ETA: 1:21 - loss: 0.6297 - regression_loss: 0.5667 - classification_loss: 0.0629 264/500 [==============>...............] - ETA: 1:21 - loss: 0.6308 - regression_loss: 0.5676 - classification_loss: 0.0631 265/500 [==============>...............] - ETA: 1:20 - loss: 0.6291 - regression_loss: 0.5662 - classification_loss: 0.0630 266/500 [==============>...............] - ETA: 1:20 - loss: 0.6297 - regression_loss: 0.5666 - classification_loss: 0.0631 267/500 [===============>..............] - ETA: 1:20 - loss: 0.6308 - regression_loss: 0.5676 - classification_loss: 0.0632 268/500 [===============>..............] - ETA: 1:19 - loss: 0.6305 - regression_loss: 0.5674 - classification_loss: 0.0631 269/500 [===============>..............] - ETA: 1:19 - loss: 0.6315 - regression_loss: 0.5684 - classification_loss: 0.0631 270/500 [===============>..............] 
- ETA: 1:19 - loss: 0.6311 - regression_loss: 0.5681 - classification_loss: 0.0630 271/500 [===============>..............] - ETA: 1:18 - loss: 0.6307 - regression_loss: 0.5677 - classification_loss: 0.0630 272/500 [===============>..............] - ETA: 1:18 - loss: 0.6299 - regression_loss: 0.5670 - classification_loss: 0.0629 273/500 [===============>..............] - ETA: 1:18 - loss: 0.6301 - regression_loss: 0.5670 - classification_loss: 0.0630 274/500 [===============>..............] - ETA: 1:17 - loss: 0.6306 - regression_loss: 0.5675 - classification_loss: 0.0631 275/500 [===============>..............] - ETA: 1:17 - loss: 0.6296 - regression_loss: 0.5665 - classification_loss: 0.0631 276/500 [===============>..............] - ETA: 1:17 - loss: 0.6285 - regression_loss: 0.5655 - classification_loss: 0.0630 277/500 [===============>..............] - ETA: 1:16 - loss: 0.6274 - regression_loss: 0.5644 - classification_loss: 0.0629 278/500 [===============>..............] - ETA: 1:16 - loss: 0.6287 - regression_loss: 0.5656 - classification_loss: 0.0631 279/500 [===============>..............] - ETA: 1:15 - loss: 0.6284 - regression_loss: 0.5654 - classification_loss: 0.0630 280/500 [===============>..............] - ETA: 1:15 - loss: 0.6282 - regression_loss: 0.5652 - classification_loss: 0.0630 281/500 [===============>..............] - ETA: 1:15 - loss: 0.6301 - regression_loss: 0.5667 - classification_loss: 0.0634 282/500 [===============>..............] - ETA: 1:14 - loss: 0.6315 - regression_loss: 0.5680 - classification_loss: 0.0635 283/500 [===============>..............] - ETA: 1:14 - loss: 0.6311 - regression_loss: 0.5676 - classification_loss: 0.0635 284/500 [================>.............] - ETA: 1:14 - loss: 0.6309 - regression_loss: 0.5675 - classification_loss: 0.0635 285/500 [================>.............] - ETA: 1:13 - loss: 0.6314 - regression_loss: 0.5678 - classification_loss: 0.0635 286/500 [================>.............] 
- ETA: 1:13 - loss: 0.6315 - regression_loss: 0.5679 - classification_loss: 0.0636 287/500 [================>.............] - ETA: 1:13 - loss: 0.6310 - regression_loss: 0.5674 - classification_loss: 0.0636 288/500 [================>.............] - ETA: 1:12 - loss: 0.6304 - regression_loss: 0.5669 - classification_loss: 0.0636 289/500 [================>.............] - ETA: 1:12 - loss: 0.6298 - regression_loss: 0.5662 - classification_loss: 0.0636 290/500 [================>.............] - ETA: 1:12 - loss: 0.6293 - regression_loss: 0.5658 - classification_loss: 0.0635 291/500 [================>.............] - ETA: 1:11 - loss: 0.6287 - regression_loss: 0.5653 - classification_loss: 0.0634 292/500 [================>.............] - ETA: 1:11 - loss: 0.6288 - regression_loss: 0.5654 - classification_loss: 0.0634 293/500 [================>.............] - ETA: 1:11 - loss: 0.6282 - regression_loss: 0.5649 - classification_loss: 0.0633 294/500 [================>.............] - ETA: 1:10 - loss: 0.6281 - regression_loss: 0.5647 - classification_loss: 0.0634 295/500 [================>.............] - ETA: 1:10 - loss: 0.6277 - regression_loss: 0.5644 - classification_loss: 0.0633 296/500 [================>.............] - ETA: 1:10 - loss: 0.6278 - regression_loss: 0.5645 - classification_loss: 0.0633 297/500 [================>.............] - ETA: 1:09 - loss: 0.6273 - regression_loss: 0.5640 - classification_loss: 0.0632 298/500 [================>.............] - ETA: 1:09 - loss: 0.6276 - regression_loss: 0.5644 - classification_loss: 0.0633 299/500 [================>.............] - ETA: 1:09 - loss: 0.6266 - regression_loss: 0.5634 - classification_loss: 0.0631 300/500 [=================>............] - ETA: 1:08 - loss: 0.6266 - regression_loss: 0.5634 - classification_loss: 0.0632 301/500 [=================>............] - ETA: 1:08 - loss: 0.6274 - regression_loss: 0.5640 - classification_loss: 0.0634 302/500 [=================>............] 
- ETA: 1:08 - loss: 0.6258 - regression_loss: 0.5626 - classification_loss: 0.0632
500/500 [==============================] - 172s 344ms/step - loss: 0.6343 - regression_loss: 0.5723 - classification_loss: 0.0619
1172 instances of class plum with average precision: 0.7428
mAP: 0.7428
Epoch 00016: saving model to ./training/snapshots/resnet101_pascal_16.h5
Epoch 17/150
1/500 [..............................] - ETA: 2:43 - loss: 0.4377 - regression_loss: 0.4019 - classification_loss: 0.0358
9/500 [..............................]
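Each progress frame in this log follows the same Keras format: a step counter, a bar, an ETA, and the three tracked losses. A minimal sketch of parsing such a frame into structured values (useful for plotting loss curves from a captured log); `parse_frame` and `LINE_RE` are hypothetical helpers written for this log format, not part of keras-retinanet:

```python
import re

# Hypothetical helper: parse one Keras progress-bar frame, e.g.
# "500/500 [====] - 172s 344ms/step - loss: 0.6343 - regression_loss: ..."
LINE_RE = re.compile(
    r"(?P<step>\d+)/(?P<total>\d+)\s+\[.*?\]"          # "500/500 [=====>...]"
    r".*?loss:\s*(?P<loss>[\d.]+)"                     # total loss
    r".*?regression_loss:\s*(?P<reg>[\d.]+)"           # smooth-L1 box loss
    r".*?classification_loss:\s*(?P<cls>[\d.]+)"       # focal loss
)

def parse_frame(line):
    """Return the step counters and losses from one frame, or None."""
    m = LINE_RE.search(line)
    if m is None:
        return None
    return {
        "step": int(m.group("step")),
        "total": int(m.group("total")),
        "loss": float(m.group("loss")),
        "regression_loss": float(m.group("reg")),
        "classification_loss": float(m.group("cls")),
    }

frame = parse_frame(
    "500/500 [==============================] - 172s 344ms/step "
    "- loss: 0.6343 - regression_loss: 0.5723 - classification_loss: 0.0619"
)
print(frame)
```

Applied line by line over a saved log, this recovers per-step loss trajectories for each epoch.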
- ETA: 2:45 - loss: 0.4921 - regression_loss: 0.4405 - classification_loss: 0.0517
136/500 [=======>......................] - ETA: 2:04 - loss: 0.6056 - regression_loss: 0.5442 - classification_loss: 0.0614
137/500 [=======>......................]
- ETA: 2:04 - loss: 0.6054 - regression_loss: 0.5442 - classification_loss: 0.0612 138/500 [=======>......................] - ETA: 2:03 - loss: 0.6065 - regression_loss: 0.5451 - classification_loss: 0.0613 139/500 [=======>......................] - ETA: 2:03 - loss: 0.6071 - regression_loss: 0.5458 - classification_loss: 0.0614 140/500 [=======>......................] - ETA: 2:03 - loss: 0.6062 - regression_loss: 0.5449 - classification_loss: 0.0612 141/500 [=======>......................] - ETA: 2:02 - loss: 0.6067 - regression_loss: 0.5453 - classification_loss: 0.0614 142/500 [=======>......................] - ETA: 2:02 - loss: 0.6083 - regression_loss: 0.5467 - classification_loss: 0.0616 143/500 [=======>......................] - ETA: 2:02 - loss: 0.6060 - regression_loss: 0.5447 - classification_loss: 0.0613 144/500 [=======>......................] - ETA: 2:01 - loss: 0.6084 - regression_loss: 0.5467 - classification_loss: 0.0616 145/500 [=======>......................] - ETA: 2:01 - loss: 0.6111 - regression_loss: 0.5493 - classification_loss: 0.0618 146/500 [=======>......................] - ETA: 2:01 - loss: 0.6117 - regression_loss: 0.5500 - classification_loss: 0.0617 147/500 [=======>......................] - ETA: 2:00 - loss: 0.6101 - regression_loss: 0.5486 - classification_loss: 0.0615 148/500 [=======>......................] - ETA: 2:00 - loss: 0.6089 - regression_loss: 0.5476 - classification_loss: 0.0612 149/500 [=======>......................] - ETA: 2:00 - loss: 0.6107 - regression_loss: 0.5494 - classification_loss: 0.0613 150/500 [========>.....................] - ETA: 1:59 - loss: 0.6116 - regression_loss: 0.5504 - classification_loss: 0.0613 151/500 [========>.....................] - ETA: 1:59 - loss: 0.6120 - regression_loss: 0.5507 - classification_loss: 0.0613 152/500 [========>.....................] - ETA: 1:59 - loss: 0.6111 - regression_loss: 0.5500 - classification_loss: 0.0611 153/500 [========>.....................] 
- ETA: 1:58 - loss: 0.6097 - regression_loss: 0.5488 - classification_loss: 0.0609 154/500 [========>.....................] - ETA: 1:58 - loss: 0.6074 - regression_loss: 0.5466 - classification_loss: 0.0608 155/500 [========>.....................] - ETA: 1:58 - loss: 0.6061 - regression_loss: 0.5456 - classification_loss: 0.0605 156/500 [========>.....................] - ETA: 1:57 - loss: 0.6063 - regression_loss: 0.5458 - classification_loss: 0.0605 157/500 [========>.....................] - ETA: 1:57 - loss: 0.6062 - regression_loss: 0.5459 - classification_loss: 0.0604 158/500 [========>.....................] - ETA: 1:57 - loss: 0.6044 - regression_loss: 0.5443 - classification_loss: 0.0601 159/500 [========>.....................] - ETA: 1:56 - loss: 0.6042 - regression_loss: 0.5442 - classification_loss: 0.0600 160/500 [========>.....................] - ETA: 1:56 - loss: 0.6060 - regression_loss: 0.5459 - classification_loss: 0.0600 161/500 [========>.....................] - ETA: 1:56 - loss: 0.6068 - regression_loss: 0.5468 - classification_loss: 0.0600 162/500 [========>.....................] - ETA: 1:55 - loss: 0.6078 - regression_loss: 0.5476 - classification_loss: 0.0602 163/500 [========>.....................] - ETA: 1:55 - loss: 0.6077 - regression_loss: 0.5475 - classification_loss: 0.0602 164/500 [========>.....................] - ETA: 1:54 - loss: 0.6075 - regression_loss: 0.5474 - classification_loss: 0.0601 165/500 [========>.....................] - ETA: 1:54 - loss: 0.6080 - regression_loss: 0.5478 - classification_loss: 0.0602 166/500 [========>.....................] - ETA: 1:54 - loss: 0.6122 - regression_loss: 0.5518 - classification_loss: 0.0605 167/500 [=========>....................] - ETA: 1:53 - loss: 0.6131 - regression_loss: 0.5527 - classification_loss: 0.0604 168/500 [=========>....................] - ETA: 1:53 - loss: 0.6142 - regression_loss: 0.5538 - classification_loss: 0.0603 169/500 [=========>....................] 
- ETA: 1:53 - loss: 0.6157 - regression_loss: 0.5551 - classification_loss: 0.0606 170/500 [=========>....................] - ETA: 1:52 - loss: 0.6164 - regression_loss: 0.5557 - classification_loss: 0.0607 171/500 [=========>....................] - ETA: 1:52 - loss: 0.6171 - regression_loss: 0.5562 - classification_loss: 0.0609 172/500 [=========>....................] - ETA: 1:52 - loss: 0.6150 - regression_loss: 0.5542 - classification_loss: 0.0608 173/500 [=========>....................] - ETA: 1:51 - loss: 0.6135 - regression_loss: 0.5530 - classification_loss: 0.0605 174/500 [=========>....................] - ETA: 1:51 - loss: 0.6154 - regression_loss: 0.5549 - classification_loss: 0.0606 175/500 [=========>....................] - ETA: 1:51 - loss: 0.6180 - regression_loss: 0.5573 - classification_loss: 0.0607 176/500 [=========>....................] - ETA: 1:50 - loss: 0.6203 - regression_loss: 0.5595 - classification_loss: 0.0608 177/500 [=========>....................] - ETA: 1:50 - loss: 0.6212 - regression_loss: 0.5605 - classification_loss: 0.0607 178/500 [=========>....................] - ETA: 1:50 - loss: 0.6202 - regression_loss: 0.5596 - classification_loss: 0.0606 179/500 [=========>....................] - ETA: 1:49 - loss: 0.6202 - regression_loss: 0.5596 - classification_loss: 0.0606 180/500 [=========>....................] - ETA: 1:49 - loss: 0.6212 - regression_loss: 0.5604 - classification_loss: 0.0607 181/500 [=========>....................] - ETA: 1:49 - loss: 0.6207 - regression_loss: 0.5600 - classification_loss: 0.0607 182/500 [=========>....................] - ETA: 1:48 - loss: 0.6214 - regression_loss: 0.5606 - classification_loss: 0.0608 183/500 [=========>....................] - ETA: 1:48 - loss: 0.6202 - regression_loss: 0.5595 - classification_loss: 0.0607 184/500 [==========>...................] - ETA: 1:48 - loss: 0.6213 - regression_loss: 0.5606 - classification_loss: 0.0607 185/500 [==========>...................] 
- ETA: 1:47 - loss: 0.6218 - regression_loss: 0.5611 - classification_loss: 0.0607 186/500 [==========>...................] - ETA: 1:47 - loss: 0.6202 - regression_loss: 0.5597 - classification_loss: 0.0605 187/500 [==========>...................] - ETA: 1:47 - loss: 0.6191 - regression_loss: 0.5588 - classification_loss: 0.0603 188/500 [==========>...................] - ETA: 1:46 - loss: 0.6211 - regression_loss: 0.5606 - classification_loss: 0.0605 189/500 [==========>...................] - ETA: 1:46 - loss: 0.6212 - regression_loss: 0.5606 - classification_loss: 0.0605 190/500 [==========>...................] - ETA: 1:46 - loss: 0.6216 - regression_loss: 0.5612 - classification_loss: 0.0604 191/500 [==========>...................] - ETA: 1:45 - loss: 0.6207 - regression_loss: 0.5604 - classification_loss: 0.0603 192/500 [==========>...................] - ETA: 1:45 - loss: 0.6235 - regression_loss: 0.5630 - classification_loss: 0.0605 193/500 [==========>...................] - ETA: 1:45 - loss: 0.6272 - regression_loss: 0.5664 - classification_loss: 0.0608 194/500 [==========>...................] - ETA: 1:44 - loss: 0.6282 - regression_loss: 0.5676 - classification_loss: 0.0606 195/500 [==========>...................] - ETA: 1:44 - loss: 0.6293 - regression_loss: 0.5685 - classification_loss: 0.0608 196/500 [==========>...................] - ETA: 1:44 - loss: 0.6296 - regression_loss: 0.5689 - classification_loss: 0.0607 197/500 [==========>...................] - ETA: 1:43 - loss: 0.6290 - regression_loss: 0.5683 - classification_loss: 0.0607 198/500 [==========>...................] - ETA: 1:43 - loss: 0.6303 - regression_loss: 0.5697 - classification_loss: 0.0607 199/500 [==========>...................] - ETA: 1:43 - loss: 0.6301 - regression_loss: 0.5694 - classification_loss: 0.0607 200/500 [===========>..................] - ETA: 1:42 - loss: 0.6301 - regression_loss: 0.5694 - classification_loss: 0.0607 201/500 [===========>..................] 
- ETA: 1:42 - loss: 0.6289 - regression_loss: 0.5683 - classification_loss: 0.0606 202/500 [===========>..................] - ETA: 1:42 - loss: 0.6299 - regression_loss: 0.5692 - classification_loss: 0.0607 203/500 [===========>..................] - ETA: 1:41 - loss: 0.6299 - regression_loss: 0.5691 - classification_loss: 0.0608 204/500 [===========>..................] - ETA: 1:41 - loss: 0.6295 - regression_loss: 0.5687 - classification_loss: 0.0607 205/500 [===========>..................] - ETA: 1:41 - loss: 0.6301 - regression_loss: 0.5693 - classification_loss: 0.0608 206/500 [===========>..................] - ETA: 1:40 - loss: 0.6309 - regression_loss: 0.5701 - classification_loss: 0.0609 207/500 [===========>..................] - ETA: 1:40 - loss: 0.6305 - regression_loss: 0.5696 - classification_loss: 0.0609 208/500 [===========>..................] - ETA: 1:40 - loss: 0.6292 - regression_loss: 0.5685 - classification_loss: 0.0608 209/500 [===========>..................] - ETA: 1:39 - loss: 0.6300 - regression_loss: 0.5690 - classification_loss: 0.0610 210/500 [===========>..................] - ETA: 1:39 - loss: 0.6295 - regression_loss: 0.5687 - classification_loss: 0.0609 211/500 [===========>..................] - ETA: 1:39 - loss: 0.6311 - regression_loss: 0.5703 - classification_loss: 0.0609 212/500 [===========>..................] - ETA: 1:38 - loss: 0.6305 - regression_loss: 0.5697 - classification_loss: 0.0608 213/500 [===========>..................] - ETA: 1:38 - loss: 0.6297 - regression_loss: 0.5690 - classification_loss: 0.0607 214/500 [===========>..................] - ETA: 1:38 - loss: 0.6308 - regression_loss: 0.5699 - classification_loss: 0.0609 215/500 [===========>..................] - ETA: 1:37 - loss: 0.6307 - regression_loss: 0.5699 - classification_loss: 0.0609 216/500 [===========>..................] - ETA: 1:37 - loss: 0.6308 - regression_loss: 0.5698 - classification_loss: 0.0610 217/500 [============>.................] 
- ETA: 1:37 - loss: 0.6307 - regression_loss: 0.5698 - classification_loss: 0.0609 218/500 [============>.................] - ETA: 1:36 - loss: 0.6305 - regression_loss: 0.5697 - classification_loss: 0.0608 219/500 [============>.................] - ETA: 1:36 - loss: 0.6310 - regression_loss: 0.5700 - classification_loss: 0.0610 220/500 [============>.................] - ETA: 1:36 - loss: 0.6310 - regression_loss: 0.5698 - classification_loss: 0.0612 221/500 [============>.................] - ETA: 1:35 - loss: 0.6314 - regression_loss: 0.5703 - classification_loss: 0.0611 222/500 [============>.................] - ETA: 1:35 - loss: 0.6325 - regression_loss: 0.5711 - classification_loss: 0.0615 223/500 [============>.................] - ETA: 1:34 - loss: 0.6317 - regression_loss: 0.5703 - classification_loss: 0.0614 224/500 [============>.................] - ETA: 1:34 - loss: 0.6320 - regression_loss: 0.5705 - classification_loss: 0.0615 225/500 [============>.................] - ETA: 1:34 - loss: 0.6315 - regression_loss: 0.5700 - classification_loss: 0.0615 226/500 [============>.................] - ETA: 1:33 - loss: 0.6323 - regression_loss: 0.5706 - classification_loss: 0.0617 227/500 [============>.................] - ETA: 1:33 - loss: 0.6340 - regression_loss: 0.5720 - classification_loss: 0.0620 228/500 [============>.................] - ETA: 1:33 - loss: 0.6339 - regression_loss: 0.5720 - classification_loss: 0.0620 229/500 [============>.................] - ETA: 1:32 - loss: 0.6330 - regression_loss: 0.5709 - classification_loss: 0.0621 230/500 [============>.................] - ETA: 1:32 - loss: 0.6348 - regression_loss: 0.5726 - classification_loss: 0.0622 231/500 [============>.................] - ETA: 1:32 - loss: 0.6328 - regression_loss: 0.5708 - classification_loss: 0.0621 232/500 [============>.................] - ETA: 1:31 - loss: 0.6346 - regression_loss: 0.5723 - classification_loss: 0.0623 233/500 [============>.................] 
- ETA: 1:31 - loss: 0.6350 - regression_loss: 0.5727 - classification_loss: 0.0624 234/500 [=============>................] - ETA: 1:31 - loss: 0.6346 - regression_loss: 0.5723 - classification_loss: 0.0623 235/500 [=============>................] - ETA: 1:30 - loss: 0.6336 - regression_loss: 0.5715 - classification_loss: 0.0621 236/500 [=============>................] - ETA: 1:30 - loss: 0.6343 - regression_loss: 0.5721 - classification_loss: 0.0622 237/500 [=============>................] - ETA: 1:30 - loss: 0.6337 - regression_loss: 0.5715 - classification_loss: 0.0622 238/500 [=============>................] - ETA: 1:29 - loss: 0.6334 - regression_loss: 0.5712 - classification_loss: 0.0622 239/500 [=============>................] - ETA: 1:29 - loss: 0.6330 - regression_loss: 0.5708 - classification_loss: 0.0622 240/500 [=============>................] - ETA: 1:29 - loss: 0.6340 - regression_loss: 0.5714 - classification_loss: 0.0626 241/500 [=============>................] - ETA: 1:28 - loss: 0.6346 - regression_loss: 0.5719 - classification_loss: 0.0627 242/500 [=============>................] - ETA: 1:28 - loss: 0.6342 - regression_loss: 0.5716 - classification_loss: 0.0626 243/500 [=============>................] - ETA: 1:28 - loss: 0.6344 - regression_loss: 0.5719 - classification_loss: 0.0625 244/500 [=============>................] - ETA: 1:27 - loss: 0.6349 - regression_loss: 0.5722 - classification_loss: 0.0627 245/500 [=============>................] - ETA: 1:27 - loss: 0.6343 - regression_loss: 0.5717 - classification_loss: 0.0626 246/500 [=============>................] - ETA: 1:27 - loss: 0.6346 - regression_loss: 0.5719 - classification_loss: 0.0627 247/500 [=============>................] - ETA: 1:26 - loss: 0.6348 - regression_loss: 0.5720 - classification_loss: 0.0628 248/500 [=============>................] - ETA: 1:26 - loss: 0.6354 - regression_loss: 0.5727 - classification_loss: 0.0627 249/500 [=============>................] 
- ETA: 1:26 - loss: 0.6362 - regression_loss: 0.5734 - classification_loss: 0.0628 250/500 [==============>...............] - ETA: 1:25 - loss: 0.6364 - regression_loss: 0.5736 - classification_loss: 0.0628 251/500 [==============>...............] - ETA: 1:25 - loss: 0.6378 - regression_loss: 0.5747 - classification_loss: 0.0631 252/500 [==============>...............] - ETA: 1:25 - loss: 0.6376 - regression_loss: 0.5746 - classification_loss: 0.0631 253/500 [==============>...............] - ETA: 1:24 - loss: 0.6371 - regression_loss: 0.5741 - classification_loss: 0.0630 254/500 [==============>...............] - ETA: 1:24 - loss: 0.6365 - regression_loss: 0.5736 - classification_loss: 0.0629 255/500 [==============>...............] - ETA: 1:24 - loss: 0.6362 - regression_loss: 0.5734 - classification_loss: 0.0629 256/500 [==============>...............] - ETA: 1:23 - loss: 0.6357 - regression_loss: 0.5729 - classification_loss: 0.0628 257/500 [==============>...............] - ETA: 1:23 - loss: 0.6367 - regression_loss: 0.5738 - classification_loss: 0.0630 258/500 [==============>...............] - ETA: 1:23 - loss: 0.6367 - regression_loss: 0.5738 - classification_loss: 0.0629 259/500 [==============>...............] - ETA: 1:22 - loss: 0.6366 - regression_loss: 0.5736 - classification_loss: 0.0630 260/500 [==============>...............] - ETA: 1:22 - loss: 0.6362 - regression_loss: 0.5732 - classification_loss: 0.0629 261/500 [==============>...............] - ETA: 1:21 - loss: 0.6364 - regression_loss: 0.5735 - classification_loss: 0.0629 262/500 [==============>...............] - ETA: 1:21 - loss: 0.6352 - regression_loss: 0.5725 - classification_loss: 0.0627 263/500 [==============>...............] - ETA: 1:21 - loss: 0.6344 - regression_loss: 0.5718 - classification_loss: 0.0626 264/500 [==============>...............] - ETA: 1:20 - loss: 0.6345 - regression_loss: 0.5718 - classification_loss: 0.0627 265/500 [==============>...............] 
- ETA: 1:20 - loss: 0.6331 - regression_loss: 0.5705 - classification_loss: 0.0626 266/500 [==============>...............] - ETA: 1:20 - loss: 0.6331 - regression_loss: 0.5705 - classification_loss: 0.0626 267/500 [===============>..............] - ETA: 1:19 - loss: 0.6356 - regression_loss: 0.5731 - classification_loss: 0.0626 268/500 [===============>..............] - ETA: 1:19 - loss: 0.6361 - regression_loss: 0.5735 - classification_loss: 0.0626 269/500 [===============>..............] - ETA: 1:19 - loss: 0.6358 - regression_loss: 0.5732 - classification_loss: 0.0627 270/500 [===============>..............] - ETA: 1:18 - loss: 0.6361 - regression_loss: 0.5734 - classification_loss: 0.0627 271/500 [===============>..............] - ETA: 1:18 - loss: 0.6357 - regression_loss: 0.5730 - classification_loss: 0.0626 272/500 [===============>..............] - ETA: 1:18 - loss: 0.6353 - regression_loss: 0.5726 - classification_loss: 0.0627 273/500 [===============>..............] - ETA: 1:17 - loss: 0.6352 - regression_loss: 0.5725 - classification_loss: 0.0627 274/500 [===============>..............] - ETA: 1:17 - loss: 0.6377 - regression_loss: 0.5747 - classification_loss: 0.0629 275/500 [===============>..............] - ETA: 1:17 - loss: 0.6364 - regression_loss: 0.5736 - classification_loss: 0.0628 276/500 [===============>..............] - ETA: 1:16 - loss: 0.6363 - regression_loss: 0.5736 - classification_loss: 0.0628 277/500 [===============>..............] - ETA: 1:16 - loss: 0.6353 - regression_loss: 0.5727 - classification_loss: 0.0627 278/500 [===============>..............] - ETA: 1:15 - loss: 0.6352 - regression_loss: 0.5725 - classification_loss: 0.0627 279/500 [===============>..............] - ETA: 1:15 - loss: 0.6344 - regression_loss: 0.5718 - classification_loss: 0.0626 280/500 [===============>..............] - ETA: 1:15 - loss: 0.6382 - regression_loss: 0.5751 - classification_loss: 0.0631 281/500 [===============>..............] 
- ETA: 1:14 - loss: 0.6392 - regression_loss: 0.5761 - classification_loss: 0.0631 282/500 [===============>..............] - ETA: 1:14 - loss: 0.6413 - regression_loss: 0.5780 - classification_loss: 0.0633 283/500 [===============>..............] - ETA: 1:14 - loss: 0.6404 - regression_loss: 0.5772 - classification_loss: 0.0631 284/500 [================>.............] - ETA: 1:13 - loss: 0.6410 - regression_loss: 0.5778 - classification_loss: 0.0632 285/500 [================>.............] - ETA: 1:13 - loss: 0.6415 - regression_loss: 0.5783 - classification_loss: 0.0632 286/500 [================>.............] - ETA: 1:13 - loss: 0.6417 - regression_loss: 0.5785 - classification_loss: 0.0632 287/500 [================>.............] - ETA: 1:12 - loss: 0.6411 - regression_loss: 0.5780 - classification_loss: 0.0631 288/500 [================>.............] - ETA: 1:12 - loss: 0.6426 - regression_loss: 0.5794 - classification_loss: 0.0632 289/500 [================>.............] - ETA: 1:12 - loss: 0.6438 - regression_loss: 0.5806 - classification_loss: 0.0632 290/500 [================>.............] - ETA: 1:11 - loss: 0.6456 - regression_loss: 0.5822 - classification_loss: 0.0634 291/500 [================>.............] - ETA: 1:11 - loss: 0.6445 - regression_loss: 0.5811 - classification_loss: 0.0634 292/500 [================>.............] - ETA: 1:11 - loss: 0.6474 - regression_loss: 0.5836 - classification_loss: 0.0638 293/500 [================>.............] - ETA: 1:10 - loss: 0.6489 - regression_loss: 0.5849 - classification_loss: 0.0641 294/500 [================>.............] - ETA: 1:10 - loss: 0.6501 - regression_loss: 0.5859 - classification_loss: 0.0642 295/500 [================>.............] - ETA: 1:10 - loss: 0.6508 - regression_loss: 0.5866 - classification_loss: 0.0642 296/500 [================>.............] - ETA: 1:09 - loss: 0.6504 - regression_loss: 0.5862 - classification_loss: 0.0642 297/500 [================>.............] 
- ETA: 1:09 - loss: 0.6506 - regression_loss: 0.5865 - classification_loss: 0.0641 298/500 [================>.............] - ETA: 1:09 - loss: 0.6515 - regression_loss: 0.5873 - classification_loss: 0.0641 299/500 [================>.............] - ETA: 1:08 - loss: 0.6505 - regression_loss: 0.5865 - classification_loss: 0.0640 300/500 [=================>............] - ETA: 1:08 - loss: 0.6495 - regression_loss: 0.5857 - classification_loss: 0.0639 301/500 [=================>............] - ETA: 1:08 - loss: 0.6491 - regression_loss: 0.5853 - classification_loss: 0.0638 302/500 [=================>............] - ETA: 1:07 - loss: 0.6481 - regression_loss: 0.5844 - classification_loss: 0.0636 303/500 [=================>............] - ETA: 1:07 - loss: 0.6469 - regression_loss: 0.5834 - classification_loss: 0.0635 304/500 [=================>............] - ETA: 1:07 - loss: 0.6458 - regression_loss: 0.5825 - classification_loss: 0.0634 305/500 [=================>............] - ETA: 1:06 - loss: 0.6450 - regression_loss: 0.5817 - classification_loss: 0.0633 306/500 [=================>............] - ETA: 1:06 - loss: 0.6453 - regression_loss: 0.5820 - classification_loss: 0.0633 307/500 [=================>............] - ETA: 1:06 - loss: 0.6453 - regression_loss: 0.5821 - classification_loss: 0.0633 308/500 [=================>............] - ETA: 1:05 - loss: 0.6464 - regression_loss: 0.5830 - classification_loss: 0.0634 309/500 [=================>............] - ETA: 1:05 - loss: 0.6466 - regression_loss: 0.5832 - classification_loss: 0.0634 310/500 [=================>............] - ETA: 1:05 - loss: 0.6459 - regression_loss: 0.5826 - classification_loss: 0.0633 311/500 [=================>............] - ETA: 1:04 - loss: 0.6454 - regression_loss: 0.5822 - classification_loss: 0.0632 312/500 [=================>............] - ETA: 1:04 - loss: 0.6458 - regression_loss: 0.5826 - classification_loss: 0.0632 313/500 [=================>............] 
- ETA: 1:04 - loss: 0.6453 - regression_loss: 0.5820 - classification_loss: 0.0632 314/500 [=================>............] - ETA: 1:03 - loss: 0.6459 - regression_loss: 0.5826 - classification_loss: 0.0633 315/500 [=================>............] - ETA: 1:03 - loss: 0.6463 - regression_loss: 0.5830 - classification_loss: 0.0633 316/500 [=================>............] - ETA: 1:03 - loss: 0.6469 - regression_loss: 0.5836 - classification_loss: 0.0633 317/500 [==================>...........] - ETA: 1:02 - loss: 0.6465 - regression_loss: 0.5832 - classification_loss: 0.0633 318/500 [==================>...........] - ETA: 1:02 - loss: 0.6465 - regression_loss: 0.5832 - classification_loss: 0.0633 319/500 [==================>...........] - ETA: 1:02 - loss: 0.6455 - regression_loss: 0.5823 - classification_loss: 0.0632 320/500 [==================>...........] - ETA: 1:01 - loss: 0.6447 - regression_loss: 0.5816 - classification_loss: 0.0631 321/500 [==================>...........] - ETA: 1:01 - loss: 0.6433 - regression_loss: 0.5803 - classification_loss: 0.0630 322/500 [==================>...........] - ETA: 1:01 - loss: 0.6422 - regression_loss: 0.5793 - classification_loss: 0.0628 323/500 [==================>...........] - ETA: 1:00 - loss: 0.6427 - regression_loss: 0.5799 - classification_loss: 0.0628 324/500 [==================>...........] - ETA: 1:00 - loss: 0.6440 - regression_loss: 0.5811 - classification_loss: 0.0629 325/500 [==================>...........] - ETA: 1:00 - loss: 0.6440 - regression_loss: 0.5812 - classification_loss: 0.0628 326/500 [==================>...........] - ETA: 59s - loss: 0.6442 - regression_loss: 0.5814 - classification_loss: 0.0628  327/500 [==================>...........] - ETA: 59s - loss: 0.6439 - regression_loss: 0.5811 - classification_loss: 0.0628 328/500 [==================>...........] - ETA: 58s - loss: 0.6438 - regression_loss: 0.5811 - classification_loss: 0.0627 329/500 [==================>...........] 
- ETA: 58s - loss: 0.6428 - regression_loss: 0.5802 - classification_loss: 0.0626 330/500 [==================>...........] - ETA: 58s - loss: 0.6420 - regression_loss: 0.5795 - classification_loss: 0.0625 331/500 [==================>...........] - ETA: 57s - loss: 0.6421 - regression_loss: 0.5796 - classification_loss: 0.0625 332/500 [==================>...........] - ETA: 57s - loss: 0.6425 - regression_loss: 0.5799 - classification_loss: 0.0626 333/500 [==================>...........] - ETA: 57s - loss: 0.6430 - regression_loss: 0.5802 - classification_loss: 0.0628 334/500 [===================>..........] - ETA: 56s - loss: 0.6441 - regression_loss: 0.5813 - classification_loss: 0.0628 335/500 [===================>..........] - ETA: 56s - loss: 0.6443 - regression_loss: 0.5815 - classification_loss: 0.0628 336/500 [===================>..........] - ETA: 56s - loss: 0.6454 - regression_loss: 0.5824 - classification_loss: 0.0630 337/500 [===================>..........] - ETA: 55s - loss: 0.6458 - regression_loss: 0.5827 - classification_loss: 0.0631 338/500 [===================>..........] - ETA: 55s - loss: 0.6477 - regression_loss: 0.5843 - classification_loss: 0.0634 339/500 [===================>..........] - ETA: 55s - loss: 0.6477 - regression_loss: 0.5844 - classification_loss: 0.0633 340/500 [===================>..........] - ETA: 54s - loss: 0.6476 - regression_loss: 0.5843 - classification_loss: 0.0633 341/500 [===================>..........] - ETA: 54s - loss: 0.6470 - regression_loss: 0.5838 - classification_loss: 0.0632 342/500 [===================>..........] - ETA: 54s - loss: 0.6471 - regression_loss: 0.5839 - classification_loss: 0.0632 343/500 [===================>..........] - ETA: 53s - loss: 0.6461 - regression_loss: 0.5830 - classification_loss: 0.0631 344/500 [===================>..........] - ETA: 53s - loss: 0.6472 - regression_loss: 0.5841 - classification_loss: 0.0632 345/500 [===================>..........] 
[per-batch progress updates for batches 345-489 of epoch 17 trimmed; loss held steady around 0.638-0.647, regression_loss ~0.576-0.584, classification_loss ~0.062-0.063]
[per-batch progress updates for batches 490-499 of epoch 17 trimmed]
500/500 [==============================] - 171s 343ms/step - loss: 0.6377 - regression_loss: 0.5757 - classification_loss: 0.0620
1172 instances of class plum with average precision: 0.7222
mAP: 0.7222
Epoch 00017: saving model to ./training/snapshots/resnet101_pascal_17.h5
Epoch 18/150
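The total loss in each summary line is simply the sum of the two component losses that RetinaNet optimizes (a box-regression term and a classification term). A minimal sanity check against the epoch-17 numbers above, allowing for the 4-decimal rounding in the printed log:

```python
# Values copied from the epoch-17 end-of-epoch summary line.
regression_loss = 0.5757
classification_loss = 0.0620
total_loss = 0.6377

# The reported total should equal the sum of its parts,
# up to the rounding applied when the log was printed.
assert abs((regression_loss + classification_loss) - total_loss) < 1e-3
print("loss decomposition checks out")
```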
[per-batch progress updates for epoch 18 trimmed through batch 180; loss hovered around 0.55-0.58, regression_loss ~0.50-0.53, classification_loss ~0.051-0.061]
- ETA: 1:49 - loss: 0.5644 - regression_loss: 0.5116 - classification_loss: 0.0529 181/500 [=========>....................] - ETA: 1:49 - loss: 0.5654 - regression_loss: 0.5124 - classification_loss: 0.0530 182/500 [=========>....................] - ETA: 1:49 - loss: 0.5656 - regression_loss: 0.5124 - classification_loss: 0.0532 183/500 [=========>....................] - ETA: 1:48 - loss: 0.5656 - regression_loss: 0.5125 - classification_loss: 0.0531 184/500 [==========>...................] - ETA: 1:48 - loss: 0.5673 - regression_loss: 0.5138 - classification_loss: 0.0535 185/500 [==========>...................] - ETA: 1:48 - loss: 0.5668 - regression_loss: 0.5134 - classification_loss: 0.0534 186/500 [==========>...................] - ETA: 1:47 - loss: 0.5671 - regression_loss: 0.5135 - classification_loss: 0.0536 187/500 [==========>...................] - ETA: 1:47 - loss: 0.5665 - regression_loss: 0.5129 - classification_loss: 0.0536 188/500 [==========>...................] - ETA: 1:47 - loss: 0.5667 - regression_loss: 0.5132 - classification_loss: 0.0536 189/500 [==========>...................] - ETA: 1:46 - loss: 0.5686 - regression_loss: 0.5147 - classification_loss: 0.0539 190/500 [==========>...................] - ETA: 1:46 - loss: 0.5671 - regression_loss: 0.5132 - classification_loss: 0.0538 191/500 [==========>...................] - ETA: 1:46 - loss: 0.5675 - regression_loss: 0.5135 - classification_loss: 0.0539 192/500 [==========>...................] - ETA: 1:45 - loss: 0.5671 - regression_loss: 0.5133 - classification_loss: 0.0539 193/500 [==========>...................] - ETA: 1:45 - loss: 0.5665 - regression_loss: 0.5128 - classification_loss: 0.0537 194/500 [==========>...................] - ETA: 1:45 - loss: 0.5666 - regression_loss: 0.5130 - classification_loss: 0.0536 195/500 [==========>...................] - ETA: 1:44 - loss: 0.5677 - regression_loss: 0.5140 - classification_loss: 0.0537 196/500 [==========>...................] 
- ETA: 1:44 - loss: 0.5678 - regression_loss: 0.5141 - classification_loss: 0.0537 197/500 [==========>...................] - ETA: 1:44 - loss: 0.5673 - regression_loss: 0.5137 - classification_loss: 0.0536 198/500 [==========>...................] - ETA: 1:43 - loss: 0.5666 - regression_loss: 0.5131 - classification_loss: 0.0535 199/500 [==========>...................] - ETA: 1:43 - loss: 0.5667 - regression_loss: 0.5132 - classification_loss: 0.0535 200/500 [===========>..................] - ETA: 1:43 - loss: 0.5658 - regression_loss: 0.5125 - classification_loss: 0.0534 201/500 [===========>..................] - ETA: 1:42 - loss: 0.5658 - regression_loss: 0.5124 - classification_loss: 0.0534 202/500 [===========>..................] - ETA: 1:42 - loss: 0.5647 - regression_loss: 0.5115 - classification_loss: 0.0532 203/500 [===========>..................] - ETA: 1:42 - loss: 0.5641 - regression_loss: 0.5109 - classification_loss: 0.0532 204/500 [===========>..................] - ETA: 1:41 - loss: 0.5638 - regression_loss: 0.5107 - classification_loss: 0.0532 205/500 [===========>..................] - ETA: 1:41 - loss: 0.5644 - regression_loss: 0.5113 - classification_loss: 0.0532 206/500 [===========>..................] - ETA: 1:41 - loss: 0.5645 - regression_loss: 0.5112 - classification_loss: 0.0533 207/500 [===========>..................] - ETA: 1:40 - loss: 0.5642 - regression_loss: 0.5110 - classification_loss: 0.0532 208/500 [===========>..................] - ETA: 1:40 - loss: 0.5641 - regression_loss: 0.5110 - classification_loss: 0.0532 209/500 [===========>..................] - ETA: 1:40 - loss: 0.5640 - regression_loss: 0.5107 - classification_loss: 0.0533 210/500 [===========>..................] - ETA: 1:39 - loss: 0.5645 - regression_loss: 0.5112 - classification_loss: 0.0533 211/500 [===========>..................] - ETA: 1:39 - loss: 0.5649 - regression_loss: 0.5117 - classification_loss: 0.0532 212/500 [===========>..................] 
- ETA: 1:38 - loss: 0.5657 - regression_loss: 0.5123 - classification_loss: 0.0534 213/500 [===========>..................] - ETA: 1:38 - loss: 0.5663 - regression_loss: 0.5127 - classification_loss: 0.0535 214/500 [===========>..................] - ETA: 1:38 - loss: 0.5669 - regression_loss: 0.5133 - classification_loss: 0.0535 215/500 [===========>..................] - ETA: 1:37 - loss: 0.5678 - regression_loss: 0.5143 - classification_loss: 0.0535 216/500 [===========>..................] - ETA: 1:37 - loss: 0.5678 - regression_loss: 0.5143 - classification_loss: 0.0535 217/500 [============>.................] - ETA: 1:37 - loss: 0.5668 - regression_loss: 0.5135 - classification_loss: 0.0534 218/500 [============>.................] - ETA: 1:36 - loss: 0.5676 - regression_loss: 0.5143 - classification_loss: 0.0534 219/500 [============>.................] - ETA: 1:36 - loss: 0.5667 - regression_loss: 0.5134 - classification_loss: 0.0533 220/500 [============>.................] - ETA: 1:36 - loss: 0.5660 - regression_loss: 0.5128 - classification_loss: 0.0532 221/500 [============>.................] - ETA: 1:35 - loss: 0.5657 - regression_loss: 0.5125 - classification_loss: 0.0532 222/500 [============>.................] - ETA: 1:35 - loss: 0.5644 - regression_loss: 0.5113 - classification_loss: 0.0531 223/500 [============>.................] - ETA: 1:35 - loss: 0.5666 - regression_loss: 0.5134 - classification_loss: 0.0533 224/500 [============>.................] - ETA: 1:34 - loss: 0.5656 - regression_loss: 0.5125 - classification_loss: 0.0532 225/500 [============>.................] - ETA: 1:34 - loss: 0.5666 - regression_loss: 0.5135 - classification_loss: 0.0531 226/500 [============>.................] - ETA: 1:34 - loss: 0.5675 - regression_loss: 0.5143 - classification_loss: 0.0532 227/500 [============>.................] - ETA: 1:33 - loss: 0.5686 - regression_loss: 0.5154 - classification_loss: 0.0531 228/500 [============>.................] 
- ETA: 1:33 - loss: 0.5692 - regression_loss: 0.5159 - classification_loss: 0.0533 229/500 [============>.................] - ETA: 1:33 - loss: 0.5700 - regression_loss: 0.5165 - classification_loss: 0.0535 230/500 [============>.................] - ETA: 1:32 - loss: 0.5713 - regression_loss: 0.5177 - classification_loss: 0.0536 231/500 [============>.................] - ETA: 1:32 - loss: 0.5712 - regression_loss: 0.5176 - classification_loss: 0.0536 232/500 [============>.................] - ETA: 1:32 - loss: 0.5718 - regression_loss: 0.5182 - classification_loss: 0.0536 233/500 [============>.................] - ETA: 1:31 - loss: 0.5713 - regression_loss: 0.5178 - classification_loss: 0.0535 234/500 [=============>................] - ETA: 1:31 - loss: 0.5707 - regression_loss: 0.5171 - classification_loss: 0.0537 235/500 [=============>................] - ETA: 1:31 - loss: 0.5695 - regression_loss: 0.5160 - classification_loss: 0.0535 236/500 [=============>................] - ETA: 1:30 - loss: 0.5714 - regression_loss: 0.5177 - classification_loss: 0.0537 237/500 [=============>................] - ETA: 1:30 - loss: 0.5726 - regression_loss: 0.5188 - classification_loss: 0.0538 238/500 [=============>................] - ETA: 1:30 - loss: 0.5737 - regression_loss: 0.5198 - classification_loss: 0.0540 239/500 [=============>................] - ETA: 1:29 - loss: 0.5756 - regression_loss: 0.5213 - classification_loss: 0.0543 240/500 [=============>................] - ETA: 1:29 - loss: 0.5762 - regression_loss: 0.5218 - classification_loss: 0.0545 241/500 [=============>................] - ETA: 1:29 - loss: 0.5755 - regression_loss: 0.5211 - classification_loss: 0.0543 242/500 [=============>................] - ETA: 1:28 - loss: 0.5754 - regression_loss: 0.5211 - classification_loss: 0.0543 243/500 [=============>................] - ETA: 1:28 - loss: 0.5757 - regression_loss: 0.5214 - classification_loss: 0.0543 244/500 [=============>................] 
- ETA: 1:28 - loss: 0.5755 - regression_loss: 0.5212 - classification_loss: 0.0543 245/500 [=============>................] - ETA: 1:27 - loss: 0.5759 - regression_loss: 0.5215 - classification_loss: 0.0543 246/500 [=============>................] - ETA: 1:27 - loss: 0.5750 - regression_loss: 0.5208 - classification_loss: 0.0542 247/500 [=============>................] - ETA: 1:26 - loss: 0.5737 - regression_loss: 0.5196 - classification_loss: 0.0541 248/500 [=============>................] - ETA: 1:26 - loss: 0.5739 - regression_loss: 0.5199 - classification_loss: 0.0541 249/500 [=============>................] - ETA: 1:26 - loss: 0.5728 - regression_loss: 0.5189 - classification_loss: 0.0539 250/500 [==============>...............] - ETA: 1:25 - loss: 0.5744 - regression_loss: 0.5203 - classification_loss: 0.0541 251/500 [==============>...............] - ETA: 1:25 - loss: 0.5736 - regression_loss: 0.5197 - classification_loss: 0.0539 252/500 [==============>...............] - ETA: 1:25 - loss: 0.5735 - regression_loss: 0.5196 - classification_loss: 0.0539 253/500 [==============>...............] - ETA: 1:24 - loss: 0.5739 - regression_loss: 0.5200 - classification_loss: 0.0540 254/500 [==============>...............] - ETA: 1:24 - loss: 0.5737 - regression_loss: 0.5197 - classification_loss: 0.0540 255/500 [==============>...............] - ETA: 1:24 - loss: 0.5734 - regression_loss: 0.5196 - classification_loss: 0.0538 256/500 [==============>...............] - ETA: 1:23 - loss: 0.5738 - regression_loss: 0.5200 - classification_loss: 0.0538 257/500 [==============>...............] - ETA: 1:23 - loss: 0.5749 - regression_loss: 0.5209 - classification_loss: 0.0540 258/500 [==============>...............] - ETA: 1:23 - loss: 0.5761 - regression_loss: 0.5222 - classification_loss: 0.0540 259/500 [==============>...............] - ETA: 1:22 - loss: 0.5774 - regression_loss: 0.5235 - classification_loss: 0.0539 260/500 [==============>...............] 
- ETA: 1:22 - loss: 0.5766 - regression_loss: 0.5227 - classification_loss: 0.0539 261/500 [==============>...............] - ETA: 1:22 - loss: 0.5764 - regression_loss: 0.5225 - classification_loss: 0.0539 262/500 [==============>...............] - ETA: 1:21 - loss: 0.5755 - regression_loss: 0.5216 - classification_loss: 0.0539 263/500 [==============>...............] - ETA: 1:21 - loss: 0.5748 - regression_loss: 0.5210 - classification_loss: 0.0538 264/500 [==============>...............] - ETA: 1:21 - loss: 0.5739 - regression_loss: 0.5203 - classification_loss: 0.0537 265/500 [==============>...............] - ETA: 1:20 - loss: 0.5737 - regression_loss: 0.5201 - classification_loss: 0.0536 266/500 [==============>...............] - ETA: 1:20 - loss: 0.5742 - regression_loss: 0.5206 - classification_loss: 0.0537 267/500 [===============>..............] - ETA: 1:20 - loss: 0.5736 - regression_loss: 0.5200 - classification_loss: 0.0536 268/500 [===============>..............] - ETA: 1:19 - loss: 0.5730 - regression_loss: 0.5194 - classification_loss: 0.0536 269/500 [===============>..............] - ETA: 1:19 - loss: 0.5726 - regression_loss: 0.5190 - classification_loss: 0.0536 270/500 [===============>..............] - ETA: 1:19 - loss: 0.5726 - regression_loss: 0.5190 - classification_loss: 0.0536 271/500 [===============>..............] - ETA: 1:18 - loss: 0.5734 - regression_loss: 0.5198 - classification_loss: 0.0537 272/500 [===============>..............] - ETA: 1:18 - loss: 0.5724 - regression_loss: 0.5189 - classification_loss: 0.0535 273/500 [===============>..............] - ETA: 1:17 - loss: 0.5733 - regression_loss: 0.5197 - classification_loss: 0.0536 274/500 [===============>..............] - ETA: 1:17 - loss: 0.5721 - regression_loss: 0.5186 - classification_loss: 0.0535 275/500 [===============>..............] - ETA: 1:17 - loss: 0.5724 - regression_loss: 0.5187 - classification_loss: 0.0537 276/500 [===============>..............] 
- ETA: 1:16 - loss: 0.5723 - regression_loss: 0.5187 - classification_loss: 0.0536 277/500 [===============>..............] - ETA: 1:16 - loss: 0.5726 - regression_loss: 0.5189 - classification_loss: 0.0537 278/500 [===============>..............] - ETA: 1:16 - loss: 0.5741 - regression_loss: 0.5202 - classification_loss: 0.0539 279/500 [===============>..............] - ETA: 1:15 - loss: 0.5735 - regression_loss: 0.5197 - classification_loss: 0.0538 280/500 [===============>..............] - ETA: 1:15 - loss: 0.5740 - regression_loss: 0.5201 - classification_loss: 0.0539 281/500 [===============>..............] - ETA: 1:15 - loss: 0.5742 - regression_loss: 0.5203 - classification_loss: 0.0539 282/500 [===============>..............] - ETA: 1:14 - loss: 0.5747 - regression_loss: 0.5207 - classification_loss: 0.0540 283/500 [===============>..............] - ETA: 1:14 - loss: 0.5742 - regression_loss: 0.5203 - classification_loss: 0.0539 284/500 [================>.............] - ETA: 1:14 - loss: 0.5743 - regression_loss: 0.5204 - classification_loss: 0.0539 285/500 [================>.............] - ETA: 1:13 - loss: 0.5741 - regression_loss: 0.5201 - classification_loss: 0.0540 286/500 [================>.............] - ETA: 1:13 - loss: 0.5735 - regression_loss: 0.5196 - classification_loss: 0.0539 287/500 [================>.............] - ETA: 1:13 - loss: 0.5754 - regression_loss: 0.5213 - classification_loss: 0.0541 288/500 [================>.............] - ETA: 1:12 - loss: 0.5749 - regression_loss: 0.5209 - classification_loss: 0.0540 289/500 [================>.............] - ETA: 1:12 - loss: 0.5741 - regression_loss: 0.5202 - classification_loss: 0.0539 290/500 [================>.............] - ETA: 1:12 - loss: 0.5752 - regression_loss: 0.5211 - classification_loss: 0.0541 291/500 [================>.............] - ETA: 1:11 - loss: 0.5755 - regression_loss: 0.5214 - classification_loss: 0.0541 292/500 [================>.............] 
- ETA: 1:11 - loss: 0.5760 - regression_loss: 0.5217 - classification_loss: 0.0543 293/500 [================>.............] - ETA: 1:11 - loss: 0.5754 - regression_loss: 0.5212 - classification_loss: 0.0542 294/500 [================>.............] - ETA: 1:10 - loss: 0.5746 - regression_loss: 0.5205 - classification_loss: 0.0541 295/500 [================>.............] - ETA: 1:10 - loss: 0.5736 - regression_loss: 0.5196 - classification_loss: 0.0540 296/500 [================>.............] - ETA: 1:10 - loss: 0.5728 - regression_loss: 0.5189 - classification_loss: 0.0539 297/500 [================>.............] - ETA: 1:09 - loss: 0.5720 - regression_loss: 0.5182 - classification_loss: 0.0538 298/500 [================>.............] - ETA: 1:09 - loss: 0.5724 - regression_loss: 0.5186 - classification_loss: 0.0538 299/500 [================>.............] - ETA: 1:09 - loss: 0.5719 - regression_loss: 0.5181 - classification_loss: 0.0538 300/500 [=================>............] - ETA: 1:08 - loss: 0.5713 - regression_loss: 0.5175 - classification_loss: 0.0538 301/500 [=================>............] - ETA: 1:08 - loss: 0.5709 - regression_loss: 0.5172 - classification_loss: 0.0537 302/500 [=================>............] - ETA: 1:07 - loss: 0.5711 - regression_loss: 0.5175 - classification_loss: 0.0537 303/500 [=================>............] - ETA: 1:07 - loss: 0.5710 - regression_loss: 0.5172 - classification_loss: 0.0537 304/500 [=================>............] - ETA: 1:07 - loss: 0.5705 - regression_loss: 0.5168 - classification_loss: 0.0537 305/500 [=================>............] - ETA: 1:06 - loss: 0.5705 - regression_loss: 0.5168 - classification_loss: 0.0537 306/500 [=================>............] - ETA: 1:06 - loss: 0.5705 - regression_loss: 0.5168 - classification_loss: 0.0537 307/500 [=================>............] - ETA: 1:06 - loss: 0.5706 - regression_loss: 0.5169 - classification_loss: 0.0537 308/500 [=================>............] 
- ETA: 1:05 - loss: 0.5700 - regression_loss: 0.5164 - classification_loss: 0.0536 309/500 [=================>............] - ETA: 1:05 - loss: 0.5708 - regression_loss: 0.5171 - classification_loss: 0.0537 310/500 [=================>............] - ETA: 1:05 - loss: 0.5715 - regression_loss: 0.5177 - classification_loss: 0.0537 311/500 [=================>............] - ETA: 1:04 - loss: 0.5722 - regression_loss: 0.5184 - classification_loss: 0.0539 312/500 [=================>............] - ETA: 1:04 - loss: 0.5728 - regression_loss: 0.5188 - classification_loss: 0.0539 313/500 [=================>............] - ETA: 1:04 - loss: 0.5734 - regression_loss: 0.5194 - classification_loss: 0.0540 314/500 [=================>............] - ETA: 1:03 - loss: 0.5729 - regression_loss: 0.5190 - classification_loss: 0.0539 315/500 [=================>............] - ETA: 1:03 - loss: 0.5740 - regression_loss: 0.5199 - classification_loss: 0.0540 316/500 [=================>............] - ETA: 1:03 - loss: 0.5735 - regression_loss: 0.5195 - classification_loss: 0.0540 317/500 [==================>...........] - ETA: 1:02 - loss: 0.5741 - regression_loss: 0.5201 - classification_loss: 0.0541 318/500 [==================>...........] - ETA: 1:02 - loss: 0.5744 - regression_loss: 0.5203 - classification_loss: 0.0541 319/500 [==================>...........] - ETA: 1:02 - loss: 0.5745 - regression_loss: 0.5204 - classification_loss: 0.0541 320/500 [==================>...........] - ETA: 1:01 - loss: 0.5732 - regression_loss: 0.5192 - classification_loss: 0.0540 321/500 [==================>...........] - ETA: 1:01 - loss: 0.5730 - regression_loss: 0.5190 - classification_loss: 0.0540 322/500 [==================>...........] - ETA: 1:01 - loss: 0.5739 - regression_loss: 0.5199 - classification_loss: 0.0540 323/500 [==================>...........] - ETA: 1:00 - loss: 0.5737 - regression_loss: 0.5197 - classification_loss: 0.0540 324/500 [==================>...........] 
- ETA: 1:00 - loss: 0.5746 - regression_loss: 0.5206 - classification_loss: 0.0540 325/500 [==================>...........] - ETA: 1:00 - loss: 0.5743 - regression_loss: 0.5204 - classification_loss: 0.0539 326/500 [==================>...........] - ETA: 59s - loss: 0.5743 - regression_loss: 0.5203 - classification_loss: 0.0539  327/500 [==================>...........] - ETA: 59s - loss: 0.5747 - regression_loss: 0.5208 - classification_loss: 0.0539 328/500 [==================>...........] - ETA: 59s - loss: 0.5745 - regression_loss: 0.5205 - classification_loss: 0.0539 329/500 [==================>...........] - ETA: 58s - loss: 0.5746 - regression_loss: 0.5207 - classification_loss: 0.0539 330/500 [==================>...........] - ETA: 58s - loss: 0.5747 - regression_loss: 0.5209 - classification_loss: 0.0538 331/500 [==================>...........] - ETA: 58s - loss: 0.5750 - regression_loss: 0.5212 - classification_loss: 0.0539 332/500 [==================>...........] - ETA: 57s - loss: 0.5742 - regression_loss: 0.5204 - classification_loss: 0.0538 333/500 [==================>...........] - ETA: 57s - loss: 0.5740 - regression_loss: 0.5202 - classification_loss: 0.0538 334/500 [===================>..........] - ETA: 56s - loss: 0.5731 - regression_loss: 0.5194 - classification_loss: 0.0537 335/500 [===================>..........] - ETA: 56s - loss: 0.5736 - regression_loss: 0.5199 - classification_loss: 0.0537 336/500 [===================>..........] - ETA: 56s - loss: 0.5737 - regression_loss: 0.5200 - classification_loss: 0.0537 337/500 [===================>..........] - ETA: 55s - loss: 0.5740 - regression_loss: 0.5203 - classification_loss: 0.0538 338/500 [===================>..........] - ETA: 55s - loss: 0.5746 - regression_loss: 0.5207 - classification_loss: 0.0539 339/500 [===================>..........] - ETA: 55s - loss: 0.5748 - regression_loss: 0.5209 - classification_loss: 0.0539 340/500 [===================>..........] 
- ETA: 54s - loss: 0.5748 - regression_loss: 0.5209 - classification_loss: 0.0538 341/500 [===================>..........] - ETA: 54s - loss: 0.5741 - regression_loss: 0.5203 - classification_loss: 0.0538 342/500 [===================>..........] - ETA: 54s - loss: 0.5737 - regression_loss: 0.5199 - classification_loss: 0.0538 343/500 [===================>..........] - ETA: 53s - loss: 0.5746 - regression_loss: 0.5207 - classification_loss: 0.0538 344/500 [===================>..........] - ETA: 53s - loss: 0.5752 - regression_loss: 0.5213 - classification_loss: 0.0539 345/500 [===================>..........] - ETA: 53s - loss: 0.5752 - regression_loss: 0.5212 - classification_loss: 0.0539 346/500 [===================>..........] - ETA: 52s - loss: 0.5753 - regression_loss: 0.5213 - classification_loss: 0.0540 347/500 [===================>..........] - ETA: 52s - loss: 0.5745 - regression_loss: 0.5206 - classification_loss: 0.0539 348/500 [===================>..........] - ETA: 52s - loss: 0.5737 - regression_loss: 0.5199 - classification_loss: 0.0538 349/500 [===================>..........] - ETA: 51s - loss: 0.5734 - regression_loss: 0.5197 - classification_loss: 0.0537 350/500 [====================>.........] - ETA: 51s - loss: 0.5730 - regression_loss: 0.5194 - classification_loss: 0.0537 351/500 [====================>.........] - ETA: 51s - loss: 0.5727 - regression_loss: 0.5191 - classification_loss: 0.0536 352/500 [====================>.........] - ETA: 50s - loss: 0.5731 - regression_loss: 0.5196 - classification_loss: 0.0535 353/500 [====================>.........] - ETA: 50s - loss: 0.5735 - regression_loss: 0.5200 - classification_loss: 0.0535 354/500 [====================>.........] - ETA: 50s - loss: 0.5735 - regression_loss: 0.5199 - classification_loss: 0.0535 355/500 [====================>.........] - ETA: 49s - loss: 0.5743 - regression_loss: 0.5207 - classification_loss: 0.0537 356/500 [====================>.........] 
- ETA: 49s - loss: 0.5739 - regression_loss: 0.5203 - classification_loss: 0.0536 357/500 [====================>.........] - ETA: 49s - loss: 0.5732 - regression_loss: 0.5197 - classification_loss: 0.0535 358/500 [====================>.........] - ETA: 48s - loss: 0.5730 - regression_loss: 0.5195 - classification_loss: 0.0535 359/500 [====================>.........] - ETA: 48s - loss: 0.5723 - regression_loss: 0.5189 - classification_loss: 0.0534 360/500 [====================>.........] - ETA: 48s - loss: 0.5723 - regression_loss: 0.5189 - classification_loss: 0.0534 361/500 [====================>.........] - ETA: 47s - loss: 0.5718 - regression_loss: 0.5186 - classification_loss: 0.0533 362/500 [====================>.........] - ETA: 47s - loss: 0.5719 - regression_loss: 0.5187 - classification_loss: 0.0532 363/500 [====================>.........] - ETA: 47s - loss: 0.5709 - regression_loss: 0.5178 - classification_loss: 0.0531 364/500 [====================>.........] - ETA: 46s - loss: 0.5708 - regression_loss: 0.5178 - classification_loss: 0.0530 365/500 [====================>.........] - ETA: 46s - loss: 0.5715 - regression_loss: 0.5185 - classification_loss: 0.0530 366/500 [====================>.........] - ETA: 45s - loss: 0.5709 - regression_loss: 0.5179 - classification_loss: 0.0530 367/500 [=====================>........] - ETA: 45s - loss: 0.5708 - regression_loss: 0.5178 - classification_loss: 0.0530 368/500 [=====================>........] - ETA: 45s - loss: 0.5700 - regression_loss: 0.5171 - classification_loss: 0.0529 369/500 [=====================>........] - ETA: 44s - loss: 0.5702 - regression_loss: 0.5173 - classification_loss: 0.0529 370/500 [=====================>........] - ETA: 44s - loss: 0.5702 - regression_loss: 0.5173 - classification_loss: 0.0529 371/500 [=====================>........] - ETA: 44s - loss: 0.5700 - regression_loss: 0.5172 - classification_loss: 0.0528 372/500 [=====================>........] 
- ETA: 43s - loss: 0.5700 - regression_loss: 0.5172 - classification_loss: 0.0528 373/500 [=====================>........] - ETA: 43s - loss: 0.5698 - regression_loss: 0.5171 - classification_loss: 0.0528 374/500 [=====================>........] - ETA: 43s - loss: 0.5703 - regression_loss: 0.5175 - classification_loss: 0.0528 375/500 [=====================>........] - ETA: 42s - loss: 0.5696 - regression_loss: 0.5170 - classification_loss: 0.0527 376/500 [=====================>........] - ETA: 42s - loss: 0.5696 - regression_loss: 0.5169 - classification_loss: 0.0527 377/500 [=====================>........] - ETA: 42s - loss: 0.5695 - regression_loss: 0.5168 - classification_loss: 0.0527 378/500 [=====================>........] - ETA: 41s - loss: 0.5690 - regression_loss: 0.5163 - classification_loss: 0.0526 379/500 [=====================>........] - ETA: 41s - loss: 0.5686 - regression_loss: 0.5160 - classification_loss: 0.0526 380/500 [=====================>........] - ETA: 41s - loss: 0.5682 - regression_loss: 0.5157 - classification_loss: 0.0525 381/500 [=====================>........] - ETA: 40s - loss: 0.5675 - regression_loss: 0.5150 - classification_loss: 0.0525 382/500 [=====================>........] - ETA: 40s - loss: 0.5677 - regression_loss: 0.5151 - classification_loss: 0.0526 383/500 [=====================>........] - ETA: 40s - loss: 0.5681 - regression_loss: 0.5154 - classification_loss: 0.0526 384/500 [======================>.......] - ETA: 39s - loss: 0.5674 - regression_loss: 0.5149 - classification_loss: 0.0526 385/500 [======================>.......] - ETA: 39s - loss: 0.5673 - regression_loss: 0.5148 - classification_loss: 0.0525 386/500 [======================>.......] - ETA: 39s - loss: 0.5669 - regression_loss: 0.5144 - classification_loss: 0.0525 387/500 [======================>.......] - ETA: 38s - loss: 0.5662 - regression_loss: 0.5138 - classification_loss: 0.0524 388/500 [======================>.......] 
[per-step progress updates for epoch 18, steps 389–499, omitted; loss held near 0.565]
500/500 [==============================] - 172s 343ms/step - loss: 0.5653 - regression_loss: 0.5131 - classification_loss: 0.0521
1172 instances of class plum with average precision: 0.7186
mAP: 0.7186
Epoch 00018: saving model to ./training/snapshots/resnet101_pascal_18.h5
Epoch 19/150
[per-step progress updates for epoch 19, steps 1–222, omitted; loss fluctuating around 0.57–0.58, classification_loss ~0.054]
- ETA: 1:35 - loss: 0.5798 - regression_loss: 0.5256 - classification_loss: 0.0542 223/500 [============>.................] - ETA: 1:34 - loss: 0.5806 - regression_loss: 0.5263 - classification_loss: 0.0543 224/500 [============>.................] - ETA: 1:34 - loss: 0.5816 - regression_loss: 0.5270 - classification_loss: 0.0546 225/500 [============>.................] - ETA: 1:34 - loss: 0.5818 - regression_loss: 0.5272 - classification_loss: 0.0546 226/500 [============>.................] - ETA: 1:33 - loss: 0.5817 - regression_loss: 0.5272 - classification_loss: 0.0544 227/500 [============>.................] - ETA: 1:33 - loss: 0.5816 - regression_loss: 0.5271 - classification_loss: 0.0545 228/500 [============>.................] - ETA: 1:33 - loss: 0.5804 - regression_loss: 0.5261 - classification_loss: 0.0544 229/500 [============>.................] - ETA: 1:32 - loss: 0.5796 - regression_loss: 0.5252 - classification_loss: 0.0544 230/500 [============>.................] - ETA: 1:32 - loss: 0.5797 - regression_loss: 0.5253 - classification_loss: 0.0544 231/500 [============>.................] - ETA: 1:32 - loss: 0.5783 - regression_loss: 0.5240 - classification_loss: 0.0542 232/500 [============>.................] - ETA: 1:31 - loss: 0.5783 - regression_loss: 0.5240 - classification_loss: 0.0542 233/500 [============>.................] - ETA: 1:31 - loss: 0.5771 - regression_loss: 0.5230 - classification_loss: 0.0541 234/500 [=============>................] - ETA: 1:31 - loss: 0.5780 - regression_loss: 0.5237 - classification_loss: 0.0542 235/500 [=============>................] - ETA: 1:30 - loss: 0.5786 - regression_loss: 0.5242 - classification_loss: 0.0545 236/500 [=============>................] - ETA: 1:30 - loss: 0.5790 - regression_loss: 0.5246 - classification_loss: 0.0544 237/500 [=============>................] - ETA: 1:30 - loss: 0.5789 - regression_loss: 0.5245 - classification_loss: 0.0544 238/500 [=============>................] 
- ETA: 1:29 - loss: 0.5788 - regression_loss: 0.5243 - classification_loss: 0.0545 239/500 [=============>................] - ETA: 1:29 - loss: 0.5775 - regression_loss: 0.5231 - classification_loss: 0.0544 240/500 [=============>................] - ETA: 1:29 - loss: 0.5769 - regression_loss: 0.5227 - classification_loss: 0.0543 241/500 [=============>................] - ETA: 1:28 - loss: 0.5763 - regression_loss: 0.5221 - classification_loss: 0.0542 242/500 [=============>................] - ETA: 1:28 - loss: 0.5766 - regression_loss: 0.5224 - classification_loss: 0.0542 243/500 [=============>................] - ETA: 1:28 - loss: 0.5770 - regression_loss: 0.5229 - classification_loss: 0.0542 244/500 [=============>................] - ETA: 1:27 - loss: 0.5780 - regression_loss: 0.5238 - classification_loss: 0.0542 245/500 [=============>................] - ETA: 1:27 - loss: 0.5779 - regression_loss: 0.5238 - classification_loss: 0.0541 246/500 [=============>................] - ETA: 1:27 - loss: 0.5766 - regression_loss: 0.5227 - classification_loss: 0.0540 247/500 [=============>................] - ETA: 1:26 - loss: 0.5767 - regression_loss: 0.5228 - classification_loss: 0.0539 248/500 [=============>................] - ETA: 1:26 - loss: 0.5776 - regression_loss: 0.5238 - classification_loss: 0.0538 249/500 [=============>................] - ETA: 1:26 - loss: 0.5776 - regression_loss: 0.5239 - classification_loss: 0.0537 250/500 [==============>...............] - ETA: 1:25 - loss: 0.5773 - regression_loss: 0.5236 - classification_loss: 0.0537 251/500 [==============>...............] - ETA: 1:25 - loss: 0.5776 - regression_loss: 0.5239 - classification_loss: 0.0537 252/500 [==============>...............] - ETA: 1:25 - loss: 0.5766 - regression_loss: 0.5231 - classification_loss: 0.0535 253/500 [==============>...............] - ETA: 1:24 - loss: 0.5762 - regression_loss: 0.5227 - classification_loss: 0.0535 254/500 [==============>...............] 
- ETA: 1:24 - loss: 0.5785 - regression_loss: 0.5246 - classification_loss: 0.0539 255/500 [==============>...............] - ETA: 1:24 - loss: 0.5779 - regression_loss: 0.5241 - classification_loss: 0.0538 256/500 [==============>...............] - ETA: 1:23 - loss: 0.5788 - regression_loss: 0.5250 - classification_loss: 0.0539 257/500 [==============>...............] - ETA: 1:23 - loss: 0.5791 - regression_loss: 0.5253 - classification_loss: 0.0538 258/500 [==============>...............] - ETA: 1:22 - loss: 0.5792 - regression_loss: 0.5253 - classification_loss: 0.0539 259/500 [==============>...............] - ETA: 1:22 - loss: 0.5796 - regression_loss: 0.5258 - classification_loss: 0.0538 260/500 [==============>...............] - ETA: 1:22 - loss: 0.5792 - regression_loss: 0.5254 - classification_loss: 0.0537 261/500 [==============>...............] - ETA: 1:21 - loss: 0.5783 - regression_loss: 0.5247 - classification_loss: 0.0536 262/500 [==============>...............] - ETA: 1:21 - loss: 0.5781 - regression_loss: 0.5246 - classification_loss: 0.0535 263/500 [==============>...............] - ETA: 1:21 - loss: 0.5787 - regression_loss: 0.5250 - classification_loss: 0.0537 264/500 [==============>...............] - ETA: 1:20 - loss: 0.5784 - regression_loss: 0.5247 - classification_loss: 0.0536 265/500 [==============>...............] - ETA: 1:20 - loss: 0.5777 - regression_loss: 0.5241 - classification_loss: 0.0536 266/500 [==============>...............] - ETA: 1:20 - loss: 0.5770 - regression_loss: 0.5235 - classification_loss: 0.0535 267/500 [===============>..............] - ETA: 1:19 - loss: 0.5759 - regression_loss: 0.5225 - classification_loss: 0.0534 268/500 [===============>..............] - ETA: 1:19 - loss: 0.5749 - regression_loss: 0.5216 - classification_loss: 0.0533 269/500 [===============>..............] - ETA: 1:19 - loss: 0.5755 - regression_loss: 0.5222 - classification_loss: 0.0533 270/500 [===============>..............] 
- ETA: 1:18 - loss: 0.5771 - regression_loss: 0.5237 - classification_loss: 0.0534 271/500 [===============>..............] - ETA: 1:18 - loss: 0.5776 - regression_loss: 0.5241 - classification_loss: 0.0535 272/500 [===============>..............] - ETA: 1:18 - loss: 0.5771 - regression_loss: 0.5236 - classification_loss: 0.0535 273/500 [===============>..............] - ETA: 1:17 - loss: 0.5769 - regression_loss: 0.5234 - classification_loss: 0.0535 274/500 [===============>..............] - ETA: 1:17 - loss: 0.5779 - regression_loss: 0.5243 - classification_loss: 0.0535 275/500 [===============>..............] - ETA: 1:17 - loss: 0.5775 - regression_loss: 0.5240 - classification_loss: 0.0534 276/500 [===============>..............] - ETA: 1:16 - loss: 0.5779 - regression_loss: 0.5245 - classification_loss: 0.0534 277/500 [===============>..............] - ETA: 1:16 - loss: 0.5768 - regression_loss: 0.5235 - classification_loss: 0.0533 278/500 [===============>..............] - ETA: 1:16 - loss: 0.5761 - regression_loss: 0.5228 - classification_loss: 0.0533 279/500 [===============>..............] - ETA: 1:15 - loss: 0.5761 - regression_loss: 0.5229 - classification_loss: 0.0533 280/500 [===============>..............] - ETA: 1:15 - loss: 0.5771 - regression_loss: 0.5237 - classification_loss: 0.0533 281/500 [===============>..............] - ETA: 1:15 - loss: 0.5772 - regression_loss: 0.5239 - classification_loss: 0.0533 282/500 [===============>..............] - ETA: 1:14 - loss: 0.5768 - regression_loss: 0.5235 - classification_loss: 0.0533 283/500 [===============>..............] - ETA: 1:14 - loss: 0.5764 - regression_loss: 0.5232 - classification_loss: 0.0532 284/500 [================>.............] - ETA: 1:14 - loss: 0.5762 - regression_loss: 0.5231 - classification_loss: 0.0531 285/500 [================>.............] - ETA: 1:13 - loss: 0.5760 - regression_loss: 0.5228 - classification_loss: 0.0531 286/500 [================>.............] 
- ETA: 1:13 - loss: 0.5761 - regression_loss: 0.5229 - classification_loss: 0.0532 287/500 [================>.............] - ETA: 1:12 - loss: 0.5763 - regression_loss: 0.5232 - classification_loss: 0.0532 288/500 [================>.............] - ETA: 1:12 - loss: 0.5763 - regression_loss: 0.5231 - classification_loss: 0.0531 289/500 [================>.............] - ETA: 1:12 - loss: 0.5757 - regression_loss: 0.5226 - classification_loss: 0.0531 290/500 [================>.............] - ETA: 1:11 - loss: 0.5755 - regression_loss: 0.5224 - classification_loss: 0.0531 291/500 [================>.............] - ETA: 1:11 - loss: 0.5754 - regression_loss: 0.5223 - classification_loss: 0.0530 292/500 [================>.............] - ETA: 1:11 - loss: 0.5756 - regression_loss: 0.5225 - classification_loss: 0.0531 293/500 [================>.............] - ETA: 1:10 - loss: 0.5756 - regression_loss: 0.5224 - classification_loss: 0.0531 294/500 [================>.............] - ETA: 1:10 - loss: 0.5753 - regression_loss: 0.5222 - classification_loss: 0.0531 295/500 [================>.............] - ETA: 1:10 - loss: 0.5754 - regression_loss: 0.5223 - classification_loss: 0.0531 296/500 [================>.............] - ETA: 1:09 - loss: 0.5754 - regression_loss: 0.5224 - classification_loss: 0.0530 297/500 [================>.............] - ETA: 1:09 - loss: 0.5748 - regression_loss: 0.5218 - classification_loss: 0.0530 298/500 [================>.............] - ETA: 1:09 - loss: 0.5757 - regression_loss: 0.5226 - classification_loss: 0.0531 299/500 [================>.............] - ETA: 1:08 - loss: 0.5760 - regression_loss: 0.5229 - classification_loss: 0.0531 300/500 [=================>............] - ETA: 1:08 - loss: 0.5760 - regression_loss: 0.5229 - classification_loss: 0.0532 301/500 [=================>............] - ETA: 1:08 - loss: 0.5765 - regression_loss: 0.5233 - classification_loss: 0.0532 302/500 [=================>............] 
- ETA: 1:07 - loss: 0.5767 - regression_loss: 0.5235 - classification_loss: 0.0532 303/500 [=================>............] - ETA: 1:07 - loss: 0.5760 - regression_loss: 0.5229 - classification_loss: 0.0531 304/500 [=================>............] - ETA: 1:07 - loss: 0.5759 - regression_loss: 0.5228 - classification_loss: 0.0531 305/500 [=================>............] - ETA: 1:06 - loss: 0.5757 - regression_loss: 0.5227 - classification_loss: 0.0530 306/500 [=================>............] - ETA: 1:06 - loss: 0.5752 - regression_loss: 0.5223 - classification_loss: 0.0529 307/500 [=================>............] - ETA: 1:06 - loss: 0.5741 - regression_loss: 0.5213 - classification_loss: 0.0528 308/500 [=================>............] - ETA: 1:05 - loss: 0.5734 - regression_loss: 0.5206 - classification_loss: 0.0528 309/500 [=================>............] - ETA: 1:05 - loss: 0.5721 - regression_loss: 0.5194 - classification_loss: 0.0527 310/500 [=================>............] - ETA: 1:05 - loss: 0.5717 - regression_loss: 0.5191 - classification_loss: 0.0526 311/500 [=================>............] - ETA: 1:04 - loss: 0.5717 - regression_loss: 0.5191 - classification_loss: 0.0526 312/500 [=================>............] - ETA: 1:04 - loss: 0.5714 - regression_loss: 0.5188 - classification_loss: 0.0526 313/500 [=================>............] - ETA: 1:04 - loss: 0.5718 - regression_loss: 0.5192 - classification_loss: 0.0526 314/500 [=================>............] - ETA: 1:03 - loss: 0.5712 - regression_loss: 0.5186 - classification_loss: 0.0526 315/500 [=================>............] - ETA: 1:03 - loss: 0.5712 - regression_loss: 0.5187 - classification_loss: 0.0525 316/500 [=================>............] - ETA: 1:03 - loss: 0.5712 - regression_loss: 0.5187 - classification_loss: 0.0525 317/500 [==================>...........] - ETA: 1:02 - loss: 0.5713 - regression_loss: 0.5188 - classification_loss: 0.0524 318/500 [==================>...........] 
- ETA: 1:02 - loss: 0.5707 - regression_loss: 0.5183 - classification_loss: 0.0524 319/500 [==================>...........] - ETA: 1:02 - loss: 0.5704 - regression_loss: 0.5180 - classification_loss: 0.0524 320/500 [==================>...........] - ETA: 1:01 - loss: 0.5706 - regression_loss: 0.5182 - classification_loss: 0.0524 321/500 [==================>...........] - ETA: 1:01 - loss: 0.5719 - regression_loss: 0.5194 - classification_loss: 0.0526 322/500 [==================>...........] - ETA: 1:01 - loss: 0.5719 - regression_loss: 0.5194 - classification_loss: 0.0525 323/500 [==================>...........] - ETA: 1:00 - loss: 0.5719 - regression_loss: 0.5194 - classification_loss: 0.0525 324/500 [==================>...........] - ETA: 1:00 - loss: 0.5707 - regression_loss: 0.5183 - classification_loss: 0.0524 325/500 [==================>...........] - ETA: 1:00 - loss: 0.5705 - regression_loss: 0.5181 - classification_loss: 0.0524 326/500 [==================>...........] - ETA: 59s - loss: 0.5699 - regression_loss: 0.5176 - classification_loss: 0.0523  327/500 [==================>...........] - ETA: 59s - loss: 0.5703 - regression_loss: 0.5180 - classification_loss: 0.0523 328/500 [==================>...........] - ETA: 58s - loss: 0.5697 - regression_loss: 0.5175 - classification_loss: 0.0523 329/500 [==================>...........] - ETA: 58s - loss: 0.5689 - regression_loss: 0.5168 - classification_loss: 0.0521 330/500 [==================>...........] - ETA: 58s - loss: 0.5686 - regression_loss: 0.5165 - classification_loss: 0.0520 331/500 [==================>...........] - ETA: 57s - loss: 0.5694 - regression_loss: 0.5173 - classification_loss: 0.0521 332/500 [==================>...........] - ETA: 57s - loss: 0.5696 - regression_loss: 0.5176 - classification_loss: 0.0521 333/500 [==================>...........] - ETA: 57s - loss: 0.5690 - regression_loss: 0.5170 - classification_loss: 0.0520 334/500 [===================>..........] 
- ETA: 56s - loss: 0.5696 - regression_loss: 0.5176 - classification_loss: 0.0520 335/500 [===================>..........] - ETA: 56s - loss: 0.5702 - regression_loss: 0.5181 - classification_loss: 0.0521 336/500 [===================>..........] - ETA: 56s - loss: 0.5706 - regression_loss: 0.5183 - classification_loss: 0.0523 337/500 [===================>..........] - ETA: 55s - loss: 0.5714 - regression_loss: 0.5190 - classification_loss: 0.0524 338/500 [===================>..........] - ETA: 55s - loss: 0.5704 - regression_loss: 0.5181 - classification_loss: 0.0523 339/500 [===================>..........] - ETA: 55s - loss: 0.5698 - regression_loss: 0.5176 - classification_loss: 0.0522 340/500 [===================>..........] - ETA: 54s - loss: 0.5699 - regression_loss: 0.5176 - classification_loss: 0.0523 341/500 [===================>..........] - ETA: 54s - loss: 0.5694 - regression_loss: 0.5170 - classification_loss: 0.0523 342/500 [===================>..........] - ETA: 54s - loss: 0.5698 - regression_loss: 0.5174 - classification_loss: 0.0524 343/500 [===================>..........] - ETA: 53s - loss: 0.5691 - regression_loss: 0.5168 - classification_loss: 0.0523 344/500 [===================>..........] - ETA: 53s - loss: 0.5690 - regression_loss: 0.5167 - classification_loss: 0.0523 345/500 [===================>..........] - ETA: 53s - loss: 0.5688 - regression_loss: 0.5166 - classification_loss: 0.0523 346/500 [===================>..........] - ETA: 52s - loss: 0.5693 - regression_loss: 0.5170 - classification_loss: 0.0524 347/500 [===================>..........] - ETA: 52s - loss: 0.5684 - regression_loss: 0.5161 - classification_loss: 0.0523 348/500 [===================>..........] - ETA: 52s - loss: 0.5680 - regression_loss: 0.5158 - classification_loss: 0.0522 349/500 [===================>..........] - ETA: 51s - loss: 0.5669 - regression_loss: 0.5149 - classification_loss: 0.0521 350/500 [====================>.........] 
- ETA: 51s - loss: 0.5666 - regression_loss: 0.5145 - classification_loss: 0.0521 351/500 [====================>.........] - ETA: 51s - loss: 0.5657 - regression_loss: 0.5137 - classification_loss: 0.0520 352/500 [====================>.........] - ETA: 50s - loss: 0.5659 - regression_loss: 0.5139 - classification_loss: 0.0520 353/500 [====================>.........] - ETA: 50s - loss: 0.5654 - regression_loss: 0.5134 - classification_loss: 0.0519 354/500 [====================>.........] - ETA: 50s - loss: 0.5652 - regression_loss: 0.5133 - classification_loss: 0.0519 355/500 [====================>.........] - ETA: 49s - loss: 0.5663 - regression_loss: 0.5144 - classification_loss: 0.0519 356/500 [====================>.........] - ETA: 49s - loss: 0.5670 - regression_loss: 0.5149 - classification_loss: 0.0520 357/500 [====================>.........] - ETA: 49s - loss: 0.5671 - regression_loss: 0.5151 - classification_loss: 0.0520 358/500 [====================>.........] - ETA: 48s - loss: 0.5679 - regression_loss: 0.5158 - classification_loss: 0.0520 359/500 [====================>.........] - ETA: 48s - loss: 0.5680 - regression_loss: 0.5160 - classification_loss: 0.0520 360/500 [====================>.........] - ETA: 48s - loss: 0.5677 - regression_loss: 0.5158 - classification_loss: 0.0519 361/500 [====================>.........] - ETA: 47s - loss: 0.5671 - regression_loss: 0.5153 - classification_loss: 0.0519 362/500 [====================>.........] - ETA: 47s - loss: 0.5671 - regression_loss: 0.5153 - classification_loss: 0.0518 363/500 [====================>.........] - ETA: 47s - loss: 0.5673 - regression_loss: 0.5154 - classification_loss: 0.0519 364/500 [====================>.........] - ETA: 46s - loss: 0.5672 - regression_loss: 0.5153 - classification_loss: 0.0520 365/500 [====================>.........] - ETA: 46s - loss: 0.5669 - regression_loss: 0.5150 - classification_loss: 0.0519 366/500 [====================>.........] 
- ETA: 45s - loss: 0.5668 - regression_loss: 0.5149 - classification_loss: 0.0519 367/500 [=====================>........] - ETA: 45s - loss: 0.5667 - regression_loss: 0.5149 - classification_loss: 0.0518 368/500 [=====================>........] - ETA: 45s - loss: 0.5666 - regression_loss: 0.5148 - classification_loss: 0.0518 369/500 [=====================>........] - ETA: 44s - loss: 0.5663 - regression_loss: 0.5145 - classification_loss: 0.0518 370/500 [=====================>........] - ETA: 44s - loss: 0.5662 - regression_loss: 0.5145 - classification_loss: 0.0518 371/500 [=====================>........] - ETA: 44s - loss: 0.5667 - regression_loss: 0.5149 - classification_loss: 0.0518 372/500 [=====================>........] - ETA: 43s - loss: 0.5669 - regression_loss: 0.5152 - classification_loss: 0.0517 373/500 [=====================>........] - ETA: 43s - loss: 0.5671 - regression_loss: 0.5154 - classification_loss: 0.0517 374/500 [=====================>........] - ETA: 43s - loss: 0.5663 - regression_loss: 0.5146 - classification_loss: 0.0516 375/500 [=====================>........] - ETA: 42s - loss: 0.5664 - regression_loss: 0.5147 - classification_loss: 0.0517 376/500 [=====================>........] - ETA: 42s - loss: 0.5659 - regression_loss: 0.5143 - classification_loss: 0.0516 377/500 [=====================>........] - ETA: 42s - loss: 0.5657 - regression_loss: 0.5140 - classification_loss: 0.0516 378/500 [=====================>........] - ETA: 41s - loss: 0.5660 - regression_loss: 0.5143 - classification_loss: 0.0517 379/500 [=====================>........] - ETA: 41s - loss: 0.5658 - regression_loss: 0.5142 - classification_loss: 0.0517 380/500 [=====================>........] - ETA: 41s - loss: 0.5664 - regression_loss: 0.5146 - classification_loss: 0.0518 381/500 [=====================>........] - ETA: 40s - loss: 0.5661 - regression_loss: 0.5144 - classification_loss: 0.0517 382/500 [=====================>........] 
- ETA: 40s - loss: 0.5660 - regression_loss: 0.5143 - classification_loss: 0.0517 383/500 [=====================>........] - ETA: 40s - loss: 0.5656 - regression_loss: 0.5139 - classification_loss: 0.0517 384/500 [======================>.......] - ETA: 39s - loss: 0.5655 - regression_loss: 0.5138 - classification_loss: 0.0517 385/500 [======================>.......] - ETA: 39s - loss: 0.5652 - regression_loss: 0.5135 - classification_loss: 0.0517 386/500 [======================>.......] - ETA: 39s - loss: 0.5654 - regression_loss: 0.5137 - classification_loss: 0.0517 387/500 [======================>.......] - ETA: 38s - loss: 0.5654 - regression_loss: 0.5136 - classification_loss: 0.0518 388/500 [======================>.......] - ETA: 38s - loss: 0.5647 - regression_loss: 0.5130 - classification_loss: 0.0517 389/500 [======================>.......] - ETA: 38s - loss: 0.5643 - regression_loss: 0.5126 - classification_loss: 0.0516 390/500 [======================>.......] - ETA: 37s - loss: 0.5635 - regression_loss: 0.5119 - classification_loss: 0.0515 391/500 [======================>.......] - ETA: 37s - loss: 0.5629 - regression_loss: 0.5114 - classification_loss: 0.0515 392/500 [======================>.......] - ETA: 37s - loss: 0.5626 - regression_loss: 0.5111 - classification_loss: 0.0514 393/500 [======================>.......] - ETA: 36s - loss: 0.5629 - regression_loss: 0.5114 - classification_loss: 0.0515 394/500 [======================>.......] - ETA: 36s - loss: 0.5626 - regression_loss: 0.5112 - classification_loss: 0.0514 395/500 [======================>.......] - ETA: 36s - loss: 0.5627 - regression_loss: 0.5113 - classification_loss: 0.0514 396/500 [======================>.......] - ETA: 35s - loss: 0.5623 - regression_loss: 0.5110 - classification_loss: 0.0514 397/500 [======================>.......] - ETA: 35s - loss: 0.5623 - regression_loss: 0.5110 - classification_loss: 0.0513 398/500 [======================>.......] 
- ETA: 35s - loss: 0.5619 - regression_loss: 0.5106 - classification_loss: 0.0512 399/500 [======================>.......] - ETA: 34s - loss: 0.5621 - regression_loss: 0.5108 - classification_loss: 0.0513 400/500 [=======================>......] - ETA: 34s - loss: 0.5627 - regression_loss: 0.5114 - classification_loss: 0.0513 401/500 [=======================>......] - ETA: 34s - loss: 0.5620 - regression_loss: 0.5107 - classification_loss: 0.0513 402/500 [=======================>......] - ETA: 33s - loss: 0.5613 - regression_loss: 0.5101 - classification_loss: 0.0512 403/500 [=======================>......] - ETA: 33s - loss: 0.5613 - regression_loss: 0.5102 - classification_loss: 0.0511 404/500 [=======================>......] - ETA: 32s - loss: 0.5609 - regression_loss: 0.5098 - classification_loss: 0.0511 405/500 [=======================>......] - ETA: 32s - loss: 0.5604 - regression_loss: 0.5094 - classification_loss: 0.0510 406/500 [=======================>......] - ETA: 32s - loss: 0.5605 - regression_loss: 0.5095 - classification_loss: 0.0511 407/500 [=======================>......] - ETA: 31s - loss: 0.5608 - regression_loss: 0.5096 - classification_loss: 0.0512 408/500 [=======================>......] - ETA: 31s - loss: 0.5613 - regression_loss: 0.5101 - classification_loss: 0.0512 409/500 [=======================>......] - ETA: 31s - loss: 0.5614 - regression_loss: 0.5102 - classification_loss: 0.0512 410/500 [=======================>......] - ETA: 30s - loss: 0.5616 - regression_loss: 0.5105 - classification_loss: 0.0511 411/500 [=======================>......] - ETA: 30s - loss: 0.5612 - regression_loss: 0.5101 - classification_loss: 0.0510 412/500 [=======================>......] - ETA: 30s - loss: 0.5606 - regression_loss: 0.5096 - classification_loss: 0.0510 413/500 [=======================>......] - ETA: 29s - loss: 0.5607 - regression_loss: 0.5097 - classification_loss: 0.0509 414/500 [=======================>......] 
- ETA: 29s - loss: 0.5606 - regression_loss: 0.5097 - classification_loss: 0.0509 415/500 [=======================>......] - ETA: 29s - loss: 0.5602 - regression_loss: 0.5094 - classification_loss: 0.0508 416/500 [=======================>......] - ETA: 28s - loss: 0.5602 - regression_loss: 0.5093 - classification_loss: 0.0509 417/500 [========================>.....] - ETA: 28s - loss: 0.5597 - regression_loss: 0.5089 - classification_loss: 0.0508 418/500 [========================>.....] - ETA: 28s - loss: 0.5597 - regression_loss: 0.5090 - classification_loss: 0.0507 419/500 [========================>.....] - ETA: 27s - loss: 0.5594 - regression_loss: 0.5088 - classification_loss: 0.0507 420/500 [========================>.....] - ETA: 27s - loss: 0.5598 - regression_loss: 0.5091 - classification_loss: 0.0507 421/500 [========================>.....] - ETA: 27s - loss: 0.5592 - regression_loss: 0.5085 - classification_loss: 0.0506 422/500 [========================>.....] - ETA: 26s - loss: 0.5586 - regression_loss: 0.5080 - classification_loss: 0.0505 423/500 [========================>.....] - ETA: 26s - loss: 0.5584 - regression_loss: 0.5078 - classification_loss: 0.0506 424/500 [========================>.....] - ETA: 26s - loss: 0.5580 - regression_loss: 0.5074 - classification_loss: 0.0506 425/500 [========================>.....] - ETA: 25s - loss: 0.5586 - regression_loss: 0.5080 - classification_loss: 0.0506 426/500 [========================>.....] - ETA: 25s - loss: 0.5584 - regression_loss: 0.5078 - classification_loss: 0.0506 427/500 [========================>.....] - ETA: 25s - loss: 0.5581 - regression_loss: 0.5076 - classification_loss: 0.0505 428/500 [========================>.....] - ETA: 24s - loss: 0.5580 - regression_loss: 0.5075 - classification_loss: 0.0505 429/500 [========================>.....] - ETA: 24s - loss: 0.5583 - regression_loss: 0.5078 - classification_loss: 0.0505 430/500 [========================>.....] 
[per-batch progress updates for epoch 19, steps 431–499, omitted; loss hovered around 0.56]
500/500 [==============================] - 172s 343ms/step - loss: 0.5624 - regression_loss: 0.5111 - classification_loss: 0.0514
1172 instances of class plum with average precision: 0.7377
mAP: 0.7377
Epoch 00019: saving model to ./training/snapshots/resnet101_pascal_19.h5
Epoch 20/150
[per-batch progress updates for epoch 20, steps 1–265, omitted; loss settled near 0.55–0.56]
- ETA: 1:20 - loss: 0.5535 - regression_loss: 0.5030 - classification_loss: 0.0505 266/500 [==============>...............] - ETA: 1:19 - loss: 0.5525 - regression_loss: 0.5021 - classification_loss: 0.0504 267/500 [===============>..............] - ETA: 1:19 - loss: 0.5519 - regression_loss: 0.5016 - classification_loss: 0.0503 268/500 [===============>..............] - ETA: 1:19 - loss: 0.5514 - regression_loss: 0.5012 - classification_loss: 0.0502 269/500 [===============>..............] - ETA: 1:18 - loss: 0.5506 - regression_loss: 0.5005 - classification_loss: 0.0501 270/500 [===============>..............] - ETA: 1:18 - loss: 0.5497 - regression_loss: 0.4997 - classification_loss: 0.0501 271/500 [===============>..............] - ETA: 1:18 - loss: 0.5486 - regression_loss: 0.4986 - classification_loss: 0.0500 272/500 [===============>..............] - ETA: 1:17 - loss: 0.5484 - regression_loss: 0.4985 - classification_loss: 0.0499 273/500 [===============>..............] - ETA: 1:17 - loss: 0.5484 - regression_loss: 0.4985 - classification_loss: 0.0499 274/500 [===============>..............] - ETA: 1:17 - loss: 0.5475 - regression_loss: 0.4977 - classification_loss: 0.0498 275/500 [===============>..............] - ETA: 1:16 - loss: 0.5470 - regression_loss: 0.4973 - classification_loss: 0.0497 276/500 [===============>..............] - ETA: 1:16 - loss: 0.5466 - regression_loss: 0.4969 - classification_loss: 0.0497 277/500 [===============>..............] - ETA: 1:16 - loss: 0.5456 - regression_loss: 0.4960 - classification_loss: 0.0496 278/500 [===============>..............] - ETA: 1:15 - loss: 0.5451 - regression_loss: 0.4955 - classification_loss: 0.0496 279/500 [===============>..............] - ETA: 1:15 - loss: 0.5454 - regression_loss: 0.4958 - classification_loss: 0.0496 280/500 [===============>..............] - ETA: 1:15 - loss: 0.5440 - regression_loss: 0.4945 - classification_loss: 0.0495 281/500 [===============>..............] 
- ETA: 1:14 - loss: 0.5438 - regression_loss: 0.4944 - classification_loss: 0.0494 282/500 [===============>..............] - ETA: 1:14 - loss: 0.5434 - regression_loss: 0.4940 - classification_loss: 0.0494 283/500 [===============>..............] - ETA: 1:14 - loss: 0.5426 - regression_loss: 0.4932 - classification_loss: 0.0493 284/500 [================>.............] - ETA: 1:13 - loss: 0.5426 - regression_loss: 0.4932 - classification_loss: 0.0493 285/500 [================>.............] - ETA: 1:13 - loss: 0.5435 - regression_loss: 0.4941 - classification_loss: 0.0494 286/500 [================>.............] - ETA: 1:13 - loss: 0.5432 - regression_loss: 0.4939 - classification_loss: 0.0494 287/500 [================>.............] - ETA: 1:12 - loss: 0.5427 - regression_loss: 0.4934 - classification_loss: 0.0493 288/500 [================>.............] - ETA: 1:12 - loss: 0.5433 - regression_loss: 0.4938 - classification_loss: 0.0494 289/500 [================>.............] - ETA: 1:12 - loss: 0.5431 - regression_loss: 0.4937 - classification_loss: 0.0494 290/500 [================>.............] - ETA: 1:11 - loss: 0.5424 - regression_loss: 0.4931 - classification_loss: 0.0493 291/500 [================>.............] - ETA: 1:11 - loss: 0.5421 - regression_loss: 0.4928 - classification_loss: 0.0493 292/500 [================>.............] - ETA: 1:10 - loss: 0.5417 - regression_loss: 0.4925 - classification_loss: 0.0492 293/500 [================>.............] - ETA: 1:10 - loss: 0.5417 - regression_loss: 0.4925 - classification_loss: 0.0492 294/500 [================>.............] - ETA: 1:10 - loss: 0.5417 - regression_loss: 0.4926 - classification_loss: 0.0492 295/500 [================>.............] - ETA: 1:09 - loss: 0.5414 - regression_loss: 0.4922 - classification_loss: 0.0492 296/500 [================>.............] - ETA: 1:09 - loss: 0.5425 - regression_loss: 0.4933 - classification_loss: 0.0492 297/500 [================>.............] 
- ETA: 1:09 - loss: 0.5419 - regression_loss: 0.4928 - classification_loss: 0.0491 298/500 [================>.............] - ETA: 1:08 - loss: 0.5416 - regression_loss: 0.4925 - classification_loss: 0.0491 299/500 [================>.............] - ETA: 1:08 - loss: 0.5412 - regression_loss: 0.4922 - classification_loss: 0.0491 300/500 [=================>............] - ETA: 1:08 - loss: 0.5409 - regression_loss: 0.4918 - classification_loss: 0.0490 301/500 [=================>............] - ETA: 1:07 - loss: 0.5402 - regression_loss: 0.4913 - classification_loss: 0.0489 302/500 [=================>............] - ETA: 1:07 - loss: 0.5395 - regression_loss: 0.4907 - classification_loss: 0.0488 303/500 [=================>............] - ETA: 1:07 - loss: 0.5395 - regression_loss: 0.4905 - classification_loss: 0.0489 304/500 [=================>............] - ETA: 1:06 - loss: 0.5395 - regression_loss: 0.4906 - classification_loss: 0.0489 305/500 [=================>............] - ETA: 1:06 - loss: 0.5389 - regression_loss: 0.4901 - classification_loss: 0.0488 306/500 [=================>............] - ETA: 1:06 - loss: 0.5397 - regression_loss: 0.4909 - classification_loss: 0.0488 307/500 [=================>............] - ETA: 1:05 - loss: 0.5396 - regression_loss: 0.4908 - classification_loss: 0.0488 308/500 [=================>............] - ETA: 1:05 - loss: 0.5394 - regression_loss: 0.4906 - classification_loss: 0.0488 309/500 [=================>............] - ETA: 1:05 - loss: 0.5385 - regression_loss: 0.4897 - classification_loss: 0.0487 310/500 [=================>............] - ETA: 1:04 - loss: 0.5390 - regression_loss: 0.4901 - classification_loss: 0.0489 311/500 [=================>............] - ETA: 1:04 - loss: 0.5388 - regression_loss: 0.4900 - classification_loss: 0.0488 312/500 [=================>............] - ETA: 1:04 - loss: 0.5385 - regression_loss: 0.4897 - classification_loss: 0.0488 313/500 [=================>............] 
- ETA: 1:03 - loss: 0.5384 - regression_loss: 0.4895 - classification_loss: 0.0488 314/500 [=================>............] - ETA: 1:03 - loss: 0.5384 - regression_loss: 0.4896 - classification_loss: 0.0488 315/500 [=================>............] - ETA: 1:03 - loss: 0.5395 - regression_loss: 0.4907 - classification_loss: 0.0488 316/500 [=================>............] - ETA: 1:02 - loss: 0.5391 - regression_loss: 0.4903 - classification_loss: 0.0488 317/500 [==================>...........] - ETA: 1:02 - loss: 0.5392 - regression_loss: 0.4905 - classification_loss: 0.0488 318/500 [==================>...........] - ETA: 1:02 - loss: 0.5389 - regression_loss: 0.4901 - classification_loss: 0.0488 319/500 [==================>...........] - ETA: 1:01 - loss: 0.5388 - regression_loss: 0.4900 - classification_loss: 0.0487 320/500 [==================>...........] - ETA: 1:01 - loss: 0.5388 - regression_loss: 0.4900 - classification_loss: 0.0487 321/500 [==================>...........] - ETA: 1:01 - loss: 0.5380 - regression_loss: 0.4894 - classification_loss: 0.0486 322/500 [==================>...........] - ETA: 1:00 - loss: 0.5373 - regression_loss: 0.4888 - classification_loss: 0.0485 323/500 [==================>...........] - ETA: 1:00 - loss: 0.5374 - regression_loss: 0.4888 - classification_loss: 0.0485 324/500 [==================>...........] - ETA: 1:00 - loss: 0.5374 - regression_loss: 0.4889 - classification_loss: 0.0486 325/500 [==================>...........] - ETA: 59s - loss: 0.5375 - regression_loss: 0.4889 - classification_loss: 0.0486  326/500 [==================>...........] - ETA: 59s - loss: 0.5375 - regression_loss: 0.4890 - classification_loss: 0.0486 327/500 [==================>...........] - ETA: 59s - loss: 0.5377 - regression_loss: 0.4891 - classification_loss: 0.0486 328/500 [==================>...........] - ETA: 58s - loss: 0.5377 - regression_loss: 0.4891 - classification_loss: 0.0485 329/500 [==================>...........] 
- ETA: 58s - loss: 0.5380 - regression_loss: 0.4894 - classification_loss: 0.0486 330/500 [==================>...........] - ETA: 58s - loss: 0.5394 - regression_loss: 0.4905 - classification_loss: 0.0488 331/500 [==================>...........] - ETA: 57s - loss: 0.5404 - regression_loss: 0.4915 - classification_loss: 0.0489 332/500 [==================>...........] - ETA: 57s - loss: 0.5405 - regression_loss: 0.4916 - classification_loss: 0.0489 333/500 [==================>...........] - ETA: 57s - loss: 0.5405 - regression_loss: 0.4916 - classification_loss: 0.0489 334/500 [===================>..........] - ETA: 56s - loss: 0.5404 - regression_loss: 0.4915 - classification_loss: 0.0489 335/500 [===================>..........] - ETA: 56s - loss: 0.5399 - regression_loss: 0.4910 - classification_loss: 0.0489 336/500 [===================>..........] - ETA: 56s - loss: 0.5396 - regression_loss: 0.4907 - classification_loss: 0.0489 337/500 [===================>..........] - ETA: 55s - loss: 0.5399 - regression_loss: 0.4909 - classification_loss: 0.0490 338/500 [===================>..........] - ETA: 55s - loss: 0.5400 - regression_loss: 0.4910 - classification_loss: 0.0490 339/500 [===================>..........] - ETA: 55s - loss: 0.5395 - regression_loss: 0.4906 - classification_loss: 0.0489 340/500 [===================>..........] - ETA: 54s - loss: 0.5392 - regression_loss: 0.4903 - classification_loss: 0.0489 341/500 [===================>..........] - ETA: 54s - loss: 0.5398 - regression_loss: 0.4909 - classification_loss: 0.0489 342/500 [===================>..........] - ETA: 54s - loss: 0.5395 - regression_loss: 0.4905 - classification_loss: 0.0489 343/500 [===================>..........] - ETA: 53s - loss: 0.5396 - regression_loss: 0.4907 - classification_loss: 0.0489 344/500 [===================>..........] - ETA: 53s - loss: 0.5390 - regression_loss: 0.4901 - classification_loss: 0.0489 345/500 [===================>..........] 
- ETA: 52s - loss: 0.5383 - regression_loss: 0.4895 - classification_loss: 0.0488 346/500 [===================>..........] - ETA: 52s - loss: 0.5372 - regression_loss: 0.4885 - classification_loss: 0.0487 347/500 [===================>..........] - ETA: 52s - loss: 0.5369 - regression_loss: 0.4882 - classification_loss: 0.0487 348/500 [===================>..........] - ETA: 51s - loss: 0.5369 - regression_loss: 0.4882 - classification_loss: 0.0487 349/500 [===================>..........] - ETA: 51s - loss: 0.5373 - regression_loss: 0.4887 - classification_loss: 0.0486 350/500 [====================>.........] - ETA: 51s - loss: 0.5372 - regression_loss: 0.4885 - classification_loss: 0.0487 351/500 [====================>.........] - ETA: 50s - loss: 0.5373 - regression_loss: 0.4887 - classification_loss: 0.0487 352/500 [====================>.........] - ETA: 50s - loss: 0.5377 - regression_loss: 0.4890 - classification_loss: 0.0487 353/500 [====================>.........] - ETA: 50s - loss: 0.5376 - regression_loss: 0.4889 - classification_loss: 0.0487 354/500 [====================>.........] - ETA: 49s - loss: 0.5381 - regression_loss: 0.4894 - classification_loss: 0.0488 355/500 [====================>.........] - ETA: 49s - loss: 0.5380 - regression_loss: 0.4892 - classification_loss: 0.0488 356/500 [====================>.........] - ETA: 49s - loss: 0.5370 - regression_loss: 0.4883 - classification_loss: 0.0487 357/500 [====================>.........] - ETA: 48s - loss: 0.5388 - regression_loss: 0.4900 - classification_loss: 0.0488 358/500 [====================>.........] - ETA: 48s - loss: 0.5381 - regression_loss: 0.4893 - classification_loss: 0.0488 359/500 [====================>.........] - ETA: 48s - loss: 0.5375 - regression_loss: 0.4888 - classification_loss: 0.0487 360/500 [====================>.........] - ETA: 47s - loss: 0.5376 - regression_loss: 0.4889 - classification_loss: 0.0487 361/500 [====================>.........] 
- ETA: 47s - loss: 0.5371 - regression_loss: 0.4884 - classification_loss: 0.0486 362/500 [====================>.........] - ETA: 47s - loss: 0.5368 - regression_loss: 0.4882 - classification_loss: 0.0486 363/500 [====================>.........] - ETA: 46s - loss: 0.5363 - regression_loss: 0.4878 - classification_loss: 0.0485 364/500 [====================>.........] - ETA: 46s - loss: 0.5363 - regression_loss: 0.4879 - classification_loss: 0.0485 365/500 [====================>.........] - ETA: 46s - loss: 0.5354 - regression_loss: 0.4871 - classification_loss: 0.0484 366/500 [====================>.........] - ETA: 45s - loss: 0.5350 - regression_loss: 0.4867 - classification_loss: 0.0483 367/500 [=====================>........] - ETA: 45s - loss: 0.5355 - regression_loss: 0.4871 - classification_loss: 0.0484 368/500 [=====================>........] - ETA: 45s - loss: 0.5347 - regression_loss: 0.4864 - classification_loss: 0.0483 369/500 [=====================>........] - ETA: 44s - loss: 0.5343 - regression_loss: 0.4860 - classification_loss: 0.0483 370/500 [=====================>........] - ETA: 44s - loss: 0.5342 - regression_loss: 0.4859 - classification_loss: 0.0483 371/500 [=====================>........] - ETA: 44s - loss: 0.5345 - regression_loss: 0.4862 - classification_loss: 0.0483 372/500 [=====================>........] - ETA: 43s - loss: 0.5346 - regression_loss: 0.4862 - classification_loss: 0.0483 373/500 [=====================>........] - ETA: 43s - loss: 0.5345 - regression_loss: 0.4862 - classification_loss: 0.0483 374/500 [=====================>........] - ETA: 43s - loss: 0.5342 - regression_loss: 0.4860 - classification_loss: 0.0482 375/500 [=====================>........] - ETA: 42s - loss: 0.5341 - regression_loss: 0.4859 - classification_loss: 0.0482 376/500 [=====================>........] - ETA: 42s - loss: 0.5339 - regression_loss: 0.4858 - classification_loss: 0.0482 377/500 [=====================>........] 
- ETA: 42s - loss: 0.5335 - regression_loss: 0.4854 - classification_loss: 0.0481 378/500 [=====================>........] - ETA: 41s - loss: 0.5336 - regression_loss: 0.4855 - classification_loss: 0.0481 379/500 [=====================>........] - ETA: 41s - loss: 0.5343 - regression_loss: 0.4862 - classification_loss: 0.0481 380/500 [=====================>........] - ETA: 41s - loss: 0.5338 - regression_loss: 0.4858 - classification_loss: 0.0481 381/500 [=====================>........] - ETA: 40s - loss: 0.5336 - regression_loss: 0.4856 - classification_loss: 0.0480 382/500 [=====================>........] - ETA: 40s - loss: 0.5327 - regression_loss: 0.4847 - classification_loss: 0.0480 383/500 [=====================>........] - ETA: 40s - loss: 0.5323 - regression_loss: 0.4844 - classification_loss: 0.0479 384/500 [======================>.......] - ETA: 39s - loss: 0.5322 - regression_loss: 0.4842 - classification_loss: 0.0480 385/500 [======================>.......] - ETA: 39s - loss: 0.5316 - regression_loss: 0.4837 - classification_loss: 0.0479 386/500 [======================>.......] - ETA: 39s - loss: 0.5315 - regression_loss: 0.4837 - classification_loss: 0.0478 387/500 [======================>.......] - ETA: 38s - loss: 0.5316 - regression_loss: 0.4839 - classification_loss: 0.0478 388/500 [======================>.......] - ETA: 38s - loss: 0.5323 - regression_loss: 0.4844 - classification_loss: 0.0478 389/500 [======================>.......] - ETA: 37s - loss: 0.5325 - regression_loss: 0.4846 - classification_loss: 0.0478 390/500 [======================>.......] - ETA: 37s - loss: 0.5327 - regression_loss: 0.4849 - classification_loss: 0.0478 391/500 [======================>.......] - ETA: 37s - loss: 0.5328 - regression_loss: 0.4850 - classification_loss: 0.0478 392/500 [======================>.......] - ETA: 36s - loss: 0.5321 - regression_loss: 0.4843 - classification_loss: 0.0478 393/500 [======================>.......] 
- ETA: 36s - loss: 0.5323 - regression_loss: 0.4845 - classification_loss: 0.0478 394/500 [======================>.......] - ETA: 36s - loss: 0.5334 - regression_loss: 0.4854 - classification_loss: 0.0480 395/500 [======================>.......] - ETA: 35s - loss: 0.5328 - regression_loss: 0.4849 - classification_loss: 0.0480 396/500 [======================>.......] - ETA: 35s - loss: 0.5325 - regression_loss: 0.4846 - classification_loss: 0.0479 397/500 [======================>.......] - ETA: 35s - loss: 0.5319 - regression_loss: 0.4840 - classification_loss: 0.0479 398/500 [======================>.......] - ETA: 34s - loss: 0.5316 - regression_loss: 0.4838 - classification_loss: 0.0478 399/500 [======================>.......] - ETA: 34s - loss: 0.5318 - regression_loss: 0.4839 - classification_loss: 0.0479 400/500 [=======================>......] - ETA: 34s - loss: 0.5322 - regression_loss: 0.4842 - classification_loss: 0.0480 401/500 [=======================>......] - ETA: 33s - loss: 0.5319 - regression_loss: 0.4840 - classification_loss: 0.0479 402/500 [=======================>......] - ETA: 33s - loss: 0.5316 - regression_loss: 0.4837 - classification_loss: 0.0479 403/500 [=======================>......] - ETA: 33s - loss: 0.5314 - regression_loss: 0.4836 - classification_loss: 0.0478 404/500 [=======================>......] - ETA: 32s - loss: 0.5315 - regression_loss: 0.4838 - classification_loss: 0.0477 405/500 [=======================>......] - ETA: 32s - loss: 0.5314 - regression_loss: 0.4837 - classification_loss: 0.0477 406/500 [=======================>......] - ETA: 32s - loss: 0.5307 - regression_loss: 0.4830 - classification_loss: 0.0476 407/500 [=======================>......] - ETA: 31s - loss: 0.5303 - regression_loss: 0.4827 - classification_loss: 0.0476 408/500 [=======================>......] - ETA: 31s - loss: 0.5304 - regression_loss: 0.4828 - classification_loss: 0.0476 409/500 [=======================>......] 
- ETA: 31s - loss: 0.5300 - regression_loss: 0.4824 - classification_loss: 0.0476 410/500 [=======================>......] - ETA: 30s - loss: 0.5297 - regression_loss: 0.4822 - classification_loss: 0.0475 411/500 [=======================>......] - ETA: 30s - loss: 0.5294 - regression_loss: 0.4819 - classification_loss: 0.0475 412/500 [=======================>......] - ETA: 30s - loss: 0.5292 - regression_loss: 0.4817 - classification_loss: 0.0475 413/500 [=======================>......] - ETA: 29s - loss: 0.5285 - regression_loss: 0.4811 - classification_loss: 0.0474 414/500 [=======================>......] - ETA: 29s - loss: 0.5281 - regression_loss: 0.4808 - classification_loss: 0.0473 415/500 [=======================>......] - ETA: 29s - loss: 0.5274 - regression_loss: 0.4801 - classification_loss: 0.0473 416/500 [=======================>......] - ETA: 28s - loss: 0.5280 - regression_loss: 0.4807 - classification_loss: 0.0473 417/500 [========================>.....] - ETA: 28s - loss: 0.5280 - regression_loss: 0.4807 - classification_loss: 0.0473 418/500 [========================>.....] - ETA: 28s - loss: 0.5276 - regression_loss: 0.4804 - classification_loss: 0.0473 419/500 [========================>.....] - ETA: 27s - loss: 0.5267 - regression_loss: 0.4795 - classification_loss: 0.0472 420/500 [========================>.....] - ETA: 27s - loss: 0.5267 - regression_loss: 0.4795 - classification_loss: 0.0472 421/500 [========================>.....] - ETA: 27s - loss: 0.5269 - regression_loss: 0.4798 - classification_loss: 0.0472 422/500 [========================>.....] - ETA: 26s - loss: 0.5282 - regression_loss: 0.4809 - classification_loss: 0.0473 423/500 [========================>.....] - ETA: 26s - loss: 0.5288 - regression_loss: 0.4815 - classification_loss: 0.0473 424/500 [========================>.....] - ETA: 26s - loss: 0.5293 - regression_loss: 0.4821 - classification_loss: 0.0473 425/500 [========================>.....] 
- ETA: 25s - loss: 0.5301 - regression_loss: 0.4828 - classification_loss: 0.0473 426/500 [========================>.....] - ETA: 25s - loss: 0.5311 - regression_loss: 0.4837 - classification_loss: 0.0474 427/500 [========================>.....] - ETA: 24s - loss: 0.5315 - regression_loss: 0.4841 - classification_loss: 0.0474 428/500 [========================>.....] - ETA: 24s - loss: 0.5309 - regression_loss: 0.4836 - classification_loss: 0.0474 429/500 [========================>.....] - ETA: 24s - loss: 0.5309 - regression_loss: 0.4835 - classification_loss: 0.0474 430/500 [========================>.....] - ETA: 23s - loss: 0.5314 - regression_loss: 0.4839 - classification_loss: 0.0475 431/500 [========================>.....] - ETA: 23s - loss: 0.5314 - regression_loss: 0.4839 - classification_loss: 0.0475 432/500 [========================>.....] - ETA: 23s - loss: 0.5314 - regression_loss: 0.4839 - classification_loss: 0.0475 433/500 [========================>.....] - ETA: 22s - loss: 0.5311 - regression_loss: 0.4836 - classification_loss: 0.0475 434/500 [=========================>....] - ETA: 22s - loss: 0.5320 - regression_loss: 0.4844 - classification_loss: 0.0475 435/500 [=========================>....] - ETA: 22s - loss: 0.5323 - regression_loss: 0.4847 - classification_loss: 0.0476 436/500 [=========================>....] - ETA: 21s - loss: 0.5324 - regression_loss: 0.4847 - classification_loss: 0.0476 437/500 [=========================>....] - ETA: 21s - loss: 0.5325 - regression_loss: 0.4849 - classification_loss: 0.0476 438/500 [=========================>....] - ETA: 21s - loss: 0.5328 - regression_loss: 0.4850 - classification_loss: 0.0477 439/500 [=========================>....] - ETA: 20s - loss: 0.5320 - regression_loss: 0.4843 - classification_loss: 0.0477 440/500 [=========================>....] - ETA: 20s - loss: 0.5314 - regression_loss: 0.4838 - classification_loss: 0.0476 441/500 [=========================>....] 
- ETA: 20s - loss: 0.5315 - regression_loss: 0.4839 - classification_loss: 0.0476 442/500 [=========================>....] - ETA: 19s - loss: 0.5315 - regression_loss: 0.4839 - classification_loss: 0.0476 443/500 [=========================>....] - ETA: 19s - loss: 0.5323 - regression_loss: 0.4846 - classification_loss: 0.0478 444/500 [=========================>....] - ETA: 19s - loss: 0.5323 - regression_loss: 0.4846 - classification_loss: 0.0477 445/500 [=========================>....] - ETA: 18s - loss: 0.5327 - regression_loss: 0.4848 - classification_loss: 0.0478 446/500 [=========================>....] - ETA: 18s - loss: 0.5332 - regression_loss: 0.4854 - classification_loss: 0.0478 447/500 [=========================>....] - ETA: 18s - loss: 0.5325 - regression_loss: 0.4847 - classification_loss: 0.0477 448/500 [=========================>....] - ETA: 17s - loss: 0.5323 - regression_loss: 0.4846 - classification_loss: 0.0477 449/500 [=========================>....] - ETA: 17s - loss: 0.5316 - regression_loss: 0.4840 - classification_loss: 0.0476 450/500 [==========================>...] - ETA: 17s - loss: 0.5324 - regression_loss: 0.4847 - classification_loss: 0.0477 451/500 [==========================>...] - ETA: 16s - loss: 0.5325 - regression_loss: 0.4848 - classification_loss: 0.0477 452/500 [==========================>...] - ETA: 16s - loss: 0.5324 - regression_loss: 0.4847 - classification_loss: 0.0478 453/500 [==========================>...] - ETA: 16s - loss: 0.5322 - regression_loss: 0.4845 - classification_loss: 0.0477 454/500 [==========================>...] - ETA: 15s - loss: 0.5319 - regression_loss: 0.4842 - classification_loss: 0.0477 455/500 [==========================>...] - ETA: 15s - loss: 0.5320 - regression_loss: 0.4843 - classification_loss: 0.0477 456/500 [==========================>...] - ETA: 15s - loss: 0.5317 - regression_loss: 0.4841 - classification_loss: 0.0476 457/500 [==========================>...] 
- ETA: 14s - loss: 0.5316 - regression_loss: 0.4840 - classification_loss: 0.0476 458/500 [==========================>...] - ETA: 14s - loss: 0.5316 - regression_loss: 0.4840 - classification_loss: 0.0476 459/500 [==========================>...] - ETA: 14s - loss: 0.5316 - regression_loss: 0.4840 - classification_loss: 0.0476 460/500 [==========================>...] - ETA: 13s - loss: 0.5319 - regression_loss: 0.4842 - classification_loss: 0.0477 461/500 [==========================>...] - ETA: 13s - loss: 0.5317 - regression_loss: 0.4840 - classification_loss: 0.0476 462/500 [==========================>...] - ETA: 13s - loss: 0.5315 - regression_loss: 0.4840 - classification_loss: 0.0476 463/500 [==========================>...] - ETA: 12s - loss: 0.5310 - regression_loss: 0.4835 - classification_loss: 0.0475 464/500 [==========================>...] - ETA: 12s - loss: 0.5306 - regression_loss: 0.4831 - classification_loss: 0.0475 465/500 [==========================>...] - ETA: 11s - loss: 0.5306 - regression_loss: 0.4831 - classification_loss: 0.0475 466/500 [==========================>...] - ETA: 11s - loss: 0.5308 - regression_loss: 0.4833 - classification_loss: 0.0475 467/500 [===========================>..] - ETA: 11s - loss: 0.5306 - regression_loss: 0.4831 - classification_loss: 0.0475 468/500 [===========================>..] - ETA: 10s - loss: 0.5308 - regression_loss: 0.4833 - classification_loss: 0.0475 469/500 [===========================>..] - ETA: 10s - loss: 0.5310 - regression_loss: 0.4835 - classification_loss: 0.0475 470/500 [===========================>..] - ETA: 10s - loss: 0.5306 - regression_loss: 0.4832 - classification_loss: 0.0475 471/500 [===========================>..] - ETA: 9s - loss: 0.5304 - regression_loss: 0.4829 - classification_loss: 0.0475  472/500 [===========================>..] - ETA: 9s - loss: 0.5304 - regression_loss: 0.4829 - classification_loss: 0.0475 473/500 [===========================>..] 
- ETA: 9s - loss: 0.5310 - regression_loss: 0.4835 - classification_loss: 0.0476
[per-batch progress for epoch 20, steps 474-499, omitted: loss held steady around 0.528-0.531]
500/500 [==============================] - 171s 343ms/step - loss: 0.5285 - regression_loss: 0.4811 - classification_loss: 0.0473
1172 instances of class plum with average precision: 0.7292
mAP: 0.7292
Epoch 00020: saving model to ./training/snapshots/resnet101_pascal_20.h5
Epoch 21/150
[per-batch progress for epoch 21, steps 1-307, omitted: loss rose from about 0.29 over the first few batches and settled around 0.50-0.53]
308/500 [=================>............]
- ETA: 1:11 - loss: 0.5144 - regression_loss: 0.4701 - classification_loss: 0.0443 293/500 [================>.............] - ETA: 1:11 - loss: 0.5140 - regression_loss: 0.4697 - classification_loss: 0.0442 294/500 [================>.............] - ETA: 1:10 - loss: 0.5167 - regression_loss: 0.4723 - classification_loss: 0.0444 295/500 [================>.............] - ETA: 1:10 - loss: 0.5188 - regression_loss: 0.4743 - classification_loss: 0.0445 296/500 [================>.............] - ETA: 1:09 - loss: 0.5206 - regression_loss: 0.4760 - classification_loss: 0.0446 297/500 [================>.............] - ETA: 1:09 - loss: 0.5232 - regression_loss: 0.4783 - classification_loss: 0.0449 298/500 [================>.............] - ETA: 1:09 - loss: 0.5238 - regression_loss: 0.4789 - classification_loss: 0.0449 299/500 [================>.............] - ETA: 1:08 - loss: 0.5235 - regression_loss: 0.4786 - classification_loss: 0.0449 300/500 [=================>............] - ETA: 1:08 - loss: 0.5222 - regression_loss: 0.4773 - classification_loss: 0.0448 301/500 [=================>............] - ETA: 1:08 - loss: 0.5223 - regression_loss: 0.4774 - classification_loss: 0.0449 302/500 [=================>............] - ETA: 1:07 - loss: 0.5222 - regression_loss: 0.4774 - classification_loss: 0.0448 303/500 [=================>............] - ETA: 1:07 - loss: 0.5226 - regression_loss: 0.4779 - classification_loss: 0.0447 304/500 [=================>............] - ETA: 1:07 - loss: 0.5224 - regression_loss: 0.4777 - classification_loss: 0.0447 305/500 [=================>............] - ETA: 1:06 - loss: 0.5252 - regression_loss: 0.4802 - classification_loss: 0.0450 306/500 [=================>............] - ETA: 1:06 - loss: 0.5296 - regression_loss: 0.4842 - classification_loss: 0.0454 307/500 [=================>............] - ETA: 1:06 - loss: 0.5341 - regression_loss: 0.4882 - classification_loss: 0.0459 308/500 [=================>............] 
- ETA: 1:05 - loss: 0.5339 - regression_loss: 0.4880 - classification_loss: 0.0459 309/500 [=================>............] - ETA: 1:05 - loss: 0.5340 - regression_loss: 0.4881 - classification_loss: 0.0459 310/500 [=================>............] - ETA: 1:05 - loss: 0.5332 - regression_loss: 0.4874 - classification_loss: 0.0458 311/500 [=================>............] - ETA: 1:04 - loss: 0.5340 - regression_loss: 0.4881 - classification_loss: 0.0459 312/500 [=================>............] - ETA: 1:04 - loss: 0.5340 - regression_loss: 0.4881 - classification_loss: 0.0458 313/500 [=================>............] - ETA: 1:04 - loss: 0.5340 - regression_loss: 0.4882 - classification_loss: 0.0458 314/500 [=================>............] - ETA: 1:03 - loss: 0.5349 - regression_loss: 0.4889 - classification_loss: 0.0460 315/500 [=================>............] - ETA: 1:03 - loss: 0.5341 - regression_loss: 0.4882 - classification_loss: 0.0459 316/500 [=================>............] - ETA: 1:03 - loss: 0.5344 - regression_loss: 0.4885 - classification_loss: 0.0459 317/500 [==================>...........] - ETA: 1:02 - loss: 0.5353 - regression_loss: 0.4893 - classification_loss: 0.0460 318/500 [==================>...........] - ETA: 1:02 - loss: 0.5361 - regression_loss: 0.4901 - classification_loss: 0.0461 319/500 [==================>...........] - ETA: 1:02 - loss: 0.5369 - regression_loss: 0.4908 - classification_loss: 0.0461 320/500 [==================>...........] - ETA: 1:01 - loss: 0.5370 - regression_loss: 0.4909 - classification_loss: 0.0462 321/500 [==================>...........] - ETA: 1:01 - loss: 0.5364 - regression_loss: 0.4903 - classification_loss: 0.0461 322/500 [==================>...........] - ETA: 1:00 - loss: 0.5361 - regression_loss: 0.4901 - classification_loss: 0.0460 323/500 [==================>...........] - ETA: 1:00 - loss: 0.5364 - regression_loss: 0.4904 - classification_loss: 0.0461 324/500 [==================>...........] 
- ETA: 1:00 - loss: 0.5377 - regression_loss: 0.4916 - classification_loss: 0.0461 325/500 [==================>...........] - ETA: 59s - loss: 0.5372 - regression_loss: 0.4912 - classification_loss: 0.0460  326/500 [==================>...........] - ETA: 59s - loss: 0.5374 - regression_loss: 0.4913 - classification_loss: 0.0461 327/500 [==================>...........] - ETA: 59s - loss: 0.5383 - regression_loss: 0.4920 - classification_loss: 0.0463 328/500 [==================>...........] - ETA: 58s - loss: 0.5378 - regression_loss: 0.4915 - classification_loss: 0.0463 329/500 [==================>...........] - ETA: 58s - loss: 0.5378 - regression_loss: 0.4915 - classification_loss: 0.0463 330/500 [==================>...........] - ETA: 58s - loss: 0.5379 - regression_loss: 0.4916 - classification_loss: 0.0464 331/500 [==================>...........] - ETA: 57s - loss: 0.5385 - regression_loss: 0.4921 - classification_loss: 0.0464 332/500 [==================>...........] - ETA: 57s - loss: 0.5386 - regression_loss: 0.4922 - classification_loss: 0.0464 333/500 [==================>...........] - ETA: 57s - loss: 0.5381 - regression_loss: 0.4917 - classification_loss: 0.0464 334/500 [===================>..........] - ETA: 56s - loss: 0.5371 - regression_loss: 0.4908 - classification_loss: 0.0463 335/500 [===================>..........] - ETA: 56s - loss: 0.5375 - regression_loss: 0.4912 - classification_loss: 0.0463 336/500 [===================>..........] - ETA: 56s - loss: 0.5376 - regression_loss: 0.4912 - classification_loss: 0.0463 337/500 [===================>..........] - ETA: 55s - loss: 0.5379 - regression_loss: 0.4916 - classification_loss: 0.0463 338/500 [===================>..........] - ETA: 55s - loss: 0.5381 - regression_loss: 0.4917 - classification_loss: 0.0464 339/500 [===================>..........] - ETA: 55s - loss: 0.5380 - regression_loss: 0.4916 - classification_loss: 0.0464 340/500 [===================>..........] 
- ETA: 54s - loss: 0.5378 - regression_loss: 0.4913 - classification_loss: 0.0465 341/500 [===================>..........] - ETA: 54s - loss: 0.5378 - regression_loss: 0.4913 - classification_loss: 0.0465 342/500 [===================>..........] - ETA: 54s - loss: 0.5379 - regression_loss: 0.4915 - classification_loss: 0.0465 343/500 [===================>..........] - ETA: 53s - loss: 0.5380 - regression_loss: 0.4915 - classification_loss: 0.0465 344/500 [===================>..........] - ETA: 53s - loss: 0.5389 - regression_loss: 0.4924 - classification_loss: 0.0466 345/500 [===================>..........] - ETA: 53s - loss: 0.5388 - regression_loss: 0.4923 - classification_loss: 0.0465 346/500 [===================>..........] - ETA: 52s - loss: 0.5380 - regression_loss: 0.4915 - classification_loss: 0.0465 347/500 [===================>..........] - ETA: 52s - loss: 0.5374 - regression_loss: 0.4910 - classification_loss: 0.0465 348/500 [===================>..........] - ETA: 52s - loss: 0.5370 - regression_loss: 0.4905 - classification_loss: 0.0464 349/500 [===================>..........] - ETA: 51s - loss: 0.5369 - regression_loss: 0.4906 - classification_loss: 0.0464 350/500 [====================>.........] - ETA: 51s - loss: 0.5362 - regression_loss: 0.4899 - classification_loss: 0.0463 351/500 [====================>.........] - ETA: 51s - loss: 0.5355 - regression_loss: 0.4893 - classification_loss: 0.0462 352/500 [====================>.........] - ETA: 50s - loss: 0.5348 - regression_loss: 0.4886 - classification_loss: 0.0461 353/500 [====================>.........] - ETA: 50s - loss: 0.5358 - regression_loss: 0.4896 - classification_loss: 0.0461 354/500 [====================>.........] - ETA: 50s - loss: 0.5355 - regression_loss: 0.4894 - classification_loss: 0.0461 355/500 [====================>.........] - ETA: 49s - loss: 0.5352 - regression_loss: 0.4892 - classification_loss: 0.0460 356/500 [====================>.........] 
- ETA: 49s - loss: 0.5348 - regression_loss: 0.4888 - classification_loss: 0.0460 357/500 [====================>.........] - ETA: 49s - loss: 0.5357 - regression_loss: 0.4896 - classification_loss: 0.0461 358/500 [====================>.........] - ETA: 48s - loss: 0.5352 - regression_loss: 0.4892 - classification_loss: 0.0461 359/500 [====================>.........] - ETA: 48s - loss: 0.5344 - regression_loss: 0.4884 - classification_loss: 0.0460 360/500 [====================>.........] - ETA: 47s - loss: 0.5349 - regression_loss: 0.4888 - classification_loss: 0.0461 361/500 [====================>.........] - ETA: 47s - loss: 0.5349 - regression_loss: 0.4888 - classification_loss: 0.0461 362/500 [====================>.........] - ETA: 47s - loss: 0.5345 - regression_loss: 0.4884 - classification_loss: 0.0461 363/500 [====================>.........] - ETA: 46s - loss: 0.5338 - regression_loss: 0.4878 - classification_loss: 0.0460 364/500 [====================>.........] - ETA: 46s - loss: 0.5337 - regression_loss: 0.4877 - classification_loss: 0.0460 365/500 [====================>.........] - ETA: 46s - loss: 0.5332 - regression_loss: 0.4873 - classification_loss: 0.0459 366/500 [====================>.........] - ETA: 45s - loss: 0.5335 - regression_loss: 0.4875 - classification_loss: 0.0459 367/500 [=====================>........] - ETA: 45s - loss: 0.5336 - regression_loss: 0.4877 - classification_loss: 0.0459 368/500 [=====================>........] - ETA: 45s - loss: 0.5336 - regression_loss: 0.4878 - classification_loss: 0.0459 369/500 [=====================>........] - ETA: 44s - loss: 0.5336 - regression_loss: 0.4877 - classification_loss: 0.0458 370/500 [=====================>........] - ETA: 44s - loss: 0.5327 - regression_loss: 0.4869 - classification_loss: 0.0458 371/500 [=====================>........] - ETA: 44s - loss: 0.5328 - regression_loss: 0.4871 - classification_loss: 0.0458 372/500 [=====================>........] 
- ETA: 43s - loss: 0.5326 - regression_loss: 0.4868 - classification_loss: 0.0458 373/500 [=====================>........] - ETA: 43s - loss: 0.5318 - regression_loss: 0.4860 - classification_loss: 0.0457 374/500 [=====================>........] - ETA: 43s - loss: 0.5316 - regression_loss: 0.4859 - classification_loss: 0.0457 375/500 [=====================>........] - ETA: 42s - loss: 0.5307 - regression_loss: 0.4850 - classification_loss: 0.0456 376/500 [=====================>........] - ETA: 42s - loss: 0.5303 - regression_loss: 0.4846 - classification_loss: 0.0456 377/500 [=====================>........] - ETA: 42s - loss: 0.5301 - regression_loss: 0.4845 - classification_loss: 0.0457 378/500 [=====================>........] - ETA: 41s - loss: 0.5302 - regression_loss: 0.4845 - classification_loss: 0.0457 379/500 [=====================>........] - ETA: 41s - loss: 0.5295 - regression_loss: 0.4838 - classification_loss: 0.0456 380/500 [=====================>........] - ETA: 41s - loss: 0.5291 - regression_loss: 0.4835 - classification_loss: 0.0456 381/500 [=====================>........] - ETA: 40s - loss: 0.5293 - regression_loss: 0.4837 - classification_loss: 0.0456 382/500 [=====================>........] - ETA: 40s - loss: 0.5300 - regression_loss: 0.4844 - classification_loss: 0.0456 383/500 [=====================>........] - ETA: 40s - loss: 0.5305 - regression_loss: 0.4849 - classification_loss: 0.0455 384/500 [======================>.......] - ETA: 39s - loss: 0.5307 - regression_loss: 0.4852 - classification_loss: 0.0456 385/500 [======================>.......] - ETA: 39s - loss: 0.5315 - regression_loss: 0.4859 - classification_loss: 0.0457 386/500 [======================>.......] - ETA: 39s - loss: 0.5308 - regression_loss: 0.4852 - classification_loss: 0.0456 387/500 [======================>.......] - ETA: 38s - loss: 0.5312 - regression_loss: 0.4856 - classification_loss: 0.0456 388/500 [======================>.......] 
- ETA: 38s - loss: 0.5312 - regression_loss: 0.4856 - classification_loss: 0.0456 389/500 [======================>.......] - ETA: 38s - loss: 0.5310 - regression_loss: 0.4855 - classification_loss: 0.0455 390/500 [======================>.......] - ETA: 37s - loss: 0.5305 - regression_loss: 0.4850 - classification_loss: 0.0455 391/500 [======================>.......] - ETA: 37s - loss: 0.5303 - regression_loss: 0.4848 - classification_loss: 0.0455 392/500 [======================>.......] - ETA: 36s - loss: 0.5304 - regression_loss: 0.4849 - classification_loss: 0.0455 393/500 [======================>.......] - ETA: 36s - loss: 0.5306 - regression_loss: 0.4851 - classification_loss: 0.0455 394/500 [======================>.......] - ETA: 36s - loss: 0.5308 - regression_loss: 0.4852 - classification_loss: 0.0455 395/500 [======================>.......] - ETA: 35s - loss: 0.5315 - regression_loss: 0.4858 - classification_loss: 0.0456 396/500 [======================>.......] - ETA: 35s - loss: 0.5316 - regression_loss: 0.4860 - classification_loss: 0.0456 397/500 [======================>.......] - ETA: 35s - loss: 0.5312 - regression_loss: 0.4855 - classification_loss: 0.0457 398/500 [======================>.......] - ETA: 34s - loss: 0.5310 - regression_loss: 0.4853 - classification_loss: 0.0456 399/500 [======================>.......] - ETA: 34s - loss: 0.5309 - regression_loss: 0.4853 - classification_loss: 0.0457 400/500 [=======================>......] - ETA: 34s - loss: 0.5306 - regression_loss: 0.4850 - classification_loss: 0.0456 401/500 [=======================>......] - ETA: 33s - loss: 0.5312 - regression_loss: 0.4856 - classification_loss: 0.0456 402/500 [=======================>......] - ETA: 33s - loss: 0.5308 - regression_loss: 0.4853 - classification_loss: 0.0455 403/500 [=======================>......] - ETA: 33s - loss: 0.5302 - regression_loss: 0.4848 - classification_loss: 0.0455 404/500 [=======================>......] 
- ETA: 32s - loss: 0.5297 - regression_loss: 0.4843 - classification_loss: 0.0454 405/500 [=======================>......] - ETA: 32s - loss: 0.5290 - regression_loss: 0.4837 - classification_loss: 0.0453 406/500 [=======================>......] - ETA: 32s - loss: 0.5288 - regression_loss: 0.4835 - classification_loss: 0.0453 407/500 [=======================>......] - ETA: 31s - loss: 0.5297 - regression_loss: 0.4842 - classification_loss: 0.0455 408/500 [=======================>......] - ETA: 31s - loss: 0.5295 - regression_loss: 0.4840 - classification_loss: 0.0455 409/500 [=======================>......] - ETA: 31s - loss: 0.5289 - regression_loss: 0.4836 - classification_loss: 0.0454 410/500 [=======================>......] - ETA: 30s - loss: 0.5288 - regression_loss: 0.4834 - classification_loss: 0.0454 411/500 [=======================>......] - ETA: 30s - loss: 0.5288 - regression_loss: 0.4834 - classification_loss: 0.0454 412/500 [=======================>......] - ETA: 30s - loss: 0.5285 - regression_loss: 0.4832 - classification_loss: 0.0453 413/500 [=======================>......] - ETA: 29s - loss: 0.5281 - regression_loss: 0.4828 - classification_loss: 0.0453 414/500 [=======================>......] - ETA: 29s - loss: 0.5282 - regression_loss: 0.4828 - classification_loss: 0.0454 415/500 [=======================>......] - ETA: 29s - loss: 0.5279 - regression_loss: 0.4826 - classification_loss: 0.0453 416/500 [=======================>......] - ETA: 28s - loss: 0.5283 - regression_loss: 0.4829 - classification_loss: 0.0454 417/500 [========================>.....] - ETA: 28s - loss: 0.5284 - regression_loss: 0.4830 - classification_loss: 0.0454 418/500 [========================>.....] - ETA: 28s - loss: 0.5288 - regression_loss: 0.4833 - classification_loss: 0.0455 419/500 [========================>.....] - ETA: 27s - loss: 0.5284 - regression_loss: 0.4829 - classification_loss: 0.0455 420/500 [========================>.....] 
- ETA: 27s - loss: 0.5281 - regression_loss: 0.4827 - classification_loss: 0.0454 421/500 [========================>.....] - ETA: 27s - loss: 0.5283 - regression_loss: 0.4828 - classification_loss: 0.0454 422/500 [========================>.....] - ETA: 26s - loss: 0.5280 - regression_loss: 0.4827 - classification_loss: 0.0454 423/500 [========================>.....] - ETA: 26s - loss: 0.5277 - regression_loss: 0.4824 - classification_loss: 0.0453 424/500 [========================>.....] - ETA: 26s - loss: 0.5273 - regression_loss: 0.4820 - classification_loss: 0.0453 425/500 [========================>.....] - ETA: 25s - loss: 0.5268 - regression_loss: 0.4815 - classification_loss: 0.0452 426/500 [========================>.....] - ETA: 25s - loss: 0.5269 - regression_loss: 0.4817 - classification_loss: 0.0452 427/500 [========================>.....] - ETA: 25s - loss: 0.5275 - regression_loss: 0.4821 - classification_loss: 0.0454 428/500 [========================>.....] - ETA: 24s - loss: 0.5275 - regression_loss: 0.4821 - classification_loss: 0.0454 429/500 [========================>.....] - ETA: 24s - loss: 0.5270 - regression_loss: 0.4817 - classification_loss: 0.0453 430/500 [========================>.....] - ETA: 23s - loss: 0.5264 - regression_loss: 0.4811 - classification_loss: 0.0453 431/500 [========================>.....] - ETA: 23s - loss: 0.5267 - regression_loss: 0.4814 - classification_loss: 0.0453 432/500 [========================>.....] - ETA: 23s - loss: 0.5273 - regression_loss: 0.4819 - classification_loss: 0.0454 433/500 [========================>.....] - ETA: 22s - loss: 0.5267 - regression_loss: 0.4813 - classification_loss: 0.0453 434/500 [=========================>....] - ETA: 22s - loss: 0.5269 - regression_loss: 0.4815 - classification_loss: 0.0453 435/500 [=========================>....] - ETA: 22s - loss: 0.5267 - regression_loss: 0.4813 - classification_loss: 0.0453 436/500 [=========================>....] 
- ETA: 21s - loss: 0.5265 - regression_loss: 0.4812 - classification_loss: 0.0453 437/500 [=========================>....] - ETA: 21s - loss: 0.5266 - regression_loss: 0.4812 - classification_loss: 0.0454 438/500 [=========================>....] - ETA: 21s - loss: 0.5258 - regression_loss: 0.4805 - classification_loss: 0.0453 439/500 [=========================>....] - ETA: 20s - loss: 0.5254 - regression_loss: 0.4801 - classification_loss: 0.0453 440/500 [=========================>....] - ETA: 20s - loss: 0.5248 - regression_loss: 0.4796 - classification_loss: 0.0452 441/500 [=========================>....] - ETA: 20s - loss: 0.5249 - regression_loss: 0.4797 - classification_loss: 0.0452 442/500 [=========================>....] - ETA: 19s - loss: 0.5248 - regression_loss: 0.4796 - classification_loss: 0.0452 443/500 [=========================>....] - ETA: 19s - loss: 0.5243 - regression_loss: 0.4792 - classification_loss: 0.0451 444/500 [=========================>....] - ETA: 19s - loss: 0.5246 - regression_loss: 0.4795 - classification_loss: 0.0452 445/500 [=========================>....] - ETA: 18s - loss: 0.5252 - regression_loss: 0.4800 - classification_loss: 0.0452 446/500 [=========================>....] - ETA: 18s - loss: 0.5247 - regression_loss: 0.4796 - classification_loss: 0.0451 447/500 [=========================>....] - ETA: 18s - loss: 0.5239 - regression_loss: 0.4789 - classification_loss: 0.0450 448/500 [=========================>....] - ETA: 17s - loss: 0.5238 - regression_loss: 0.4787 - classification_loss: 0.0451 449/500 [=========================>....] - ETA: 17s - loss: 0.5241 - regression_loss: 0.4790 - classification_loss: 0.0451 450/500 [==========================>...] - ETA: 17s - loss: 0.5233 - regression_loss: 0.4783 - classification_loss: 0.0450 451/500 [==========================>...] - ETA: 16s - loss: 0.5235 - regression_loss: 0.4784 - classification_loss: 0.0451 452/500 [==========================>...] 
- ETA: 16s - loss: 0.5244 - regression_loss: 0.4793 - classification_loss: 0.0452 453/500 [==========================>...] - ETA: 16s - loss: 0.5240 - regression_loss: 0.4789 - classification_loss: 0.0451 454/500 [==========================>...] - ETA: 15s - loss: 0.5241 - regression_loss: 0.4790 - classification_loss: 0.0451 455/500 [==========================>...] - ETA: 15s - loss: 0.5246 - regression_loss: 0.4794 - classification_loss: 0.0451 456/500 [==========================>...] - ETA: 15s - loss: 0.5241 - regression_loss: 0.4791 - classification_loss: 0.0451 457/500 [==========================>...] - ETA: 14s - loss: 0.5236 - regression_loss: 0.4786 - classification_loss: 0.0451 458/500 [==========================>...] - ETA: 14s - loss: 0.5241 - regression_loss: 0.4790 - classification_loss: 0.0451 459/500 [==========================>...] - ETA: 14s - loss: 0.5242 - regression_loss: 0.4791 - classification_loss: 0.0451 460/500 [==========================>...] - ETA: 13s - loss: 0.5242 - regression_loss: 0.4790 - classification_loss: 0.0451 461/500 [==========================>...] - ETA: 13s - loss: 0.5242 - regression_loss: 0.4791 - classification_loss: 0.0451 462/500 [==========================>...] - ETA: 13s - loss: 0.5240 - regression_loss: 0.4790 - classification_loss: 0.0451 463/500 [==========================>...] - ETA: 12s - loss: 0.5232 - regression_loss: 0.4782 - classification_loss: 0.0450 464/500 [==========================>...] - ETA: 12s - loss: 0.5233 - regression_loss: 0.4784 - classification_loss: 0.0449 465/500 [==========================>...] - ETA: 11s - loss: 0.5234 - regression_loss: 0.4784 - classification_loss: 0.0449 466/500 [==========================>...] - ETA: 11s - loss: 0.5230 - regression_loss: 0.4781 - classification_loss: 0.0449 467/500 [===========================>..] - ETA: 11s - loss: 0.5224 - regression_loss: 0.4775 - classification_loss: 0.0449 468/500 [===========================>..] 
- ETA: 10s - loss: 0.5229 - regression_loss: 0.4780 - classification_loss: 0.0449 469/500 [===========================>..] - ETA: 10s - loss: 0.5224 - regression_loss: 0.4775 - classification_loss: 0.0448 470/500 [===========================>..] - ETA: 10s - loss: 0.5224 - regression_loss: 0.4776 - classification_loss: 0.0448 471/500 [===========================>..] - ETA: 9s - loss: 0.5222 - regression_loss: 0.4774 - classification_loss: 0.0448  472/500 [===========================>..] - ETA: 9s - loss: 0.5216 - regression_loss: 0.4769 - classification_loss: 0.0447 473/500 [===========================>..] - ETA: 9s - loss: 0.5215 - regression_loss: 0.4768 - classification_loss: 0.0447 474/500 [===========================>..] - ETA: 8s - loss: 0.5215 - regression_loss: 0.4768 - classification_loss: 0.0447 475/500 [===========================>..] - ETA: 8s - loss: 0.5218 - regression_loss: 0.4771 - classification_loss: 0.0447 476/500 [===========================>..] - ETA: 8s - loss: 0.5220 - regression_loss: 0.4772 - classification_loss: 0.0447 477/500 [===========================>..] - ETA: 7s - loss: 0.5225 - regression_loss: 0.4777 - classification_loss: 0.0448 478/500 [===========================>..] - ETA: 7s - loss: 0.5222 - regression_loss: 0.4775 - classification_loss: 0.0448 479/500 [===========================>..] - ETA: 7s - loss: 0.5225 - regression_loss: 0.4778 - classification_loss: 0.0447 480/500 [===========================>..] - ETA: 6s - loss: 0.5225 - regression_loss: 0.4778 - classification_loss: 0.0447 481/500 [===========================>..] - ETA: 6s - loss: 0.5222 - regression_loss: 0.4775 - classification_loss: 0.0447 482/500 [===========================>..] - ETA: 6s - loss: 0.5220 - regression_loss: 0.4774 - classification_loss: 0.0446 483/500 [===========================>..] - ETA: 5s - loss: 0.5232 - regression_loss: 0.4784 - classification_loss: 0.0448 484/500 [============================>.] 
- ETA: 5s - loss: 0.5239 - regression_loss: 0.4790 - classification_loss: 0.0449 485/500 [============================>.] - ETA: 5s - loss: 0.5236 - regression_loss: 0.4788 - classification_loss: 0.0448 486/500 [============================>.] - ETA: 4s - loss: 0.5238 - regression_loss: 0.4789 - classification_loss: 0.0448 487/500 [============================>.] - ETA: 4s - loss: 0.5235 - regression_loss: 0.4788 - classification_loss: 0.0448 488/500 [============================>.] - ETA: 4s - loss: 0.5234 - regression_loss: 0.4787 - classification_loss: 0.0447 489/500 [============================>.] - ETA: 3s - loss: 0.5231 - regression_loss: 0.4785 - classification_loss: 0.0447 490/500 [============================>.] - ETA: 3s - loss: 0.5226 - regression_loss: 0.4780 - classification_loss: 0.0446 491/500 [============================>.] - ETA: 3s - loss: 0.5231 - regression_loss: 0.4784 - classification_loss: 0.0446 492/500 [============================>.] - ETA: 2s - loss: 0.5225 - regression_loss: 0.4780 - classification_loss: 0.0446 493/500 [============================>.] - ETA: 2s - loss: 0.5224 - regression_loss: 0.4778 - classification_loss: 0.0446 494/500 [============================>.] - ETA: 2s - loss: 0.5220 - regression_loss: 0.4774 - classification_loss: 0.0445 495/500 [============================>.] - ETA: 1s - loss: 0.5214 - regression_loss: 0.4769 - classification_loss: 0.0445 496/500 [============================>.] - ETA: 1s - loss: 0.5213 - regression_loss: 0.4769 - classification_loss: 0.0444 497/500 [============================>.] - ETA: 1s - loss: 0.5215 - regression_loss: 0.4771 - classification_loss: 0.0444 498/500 [============================>.] - ETA: 0s - loss: 0.5215 - regression_loss: 0.4771 - classification_loss: 0.0444 499/500 [============================>.] 
- ETA: 0s - loss: 0.5216 - regression_loss: 0.4772 - classification_loss: 0.0444 500/500 [==============================] - 171s 343ms/step - loss: 0.5214 - regression_loss: 0.4770 - classification_loss: 0.0443 1172 instances of class plum with average precision: 0.6943 mAP: 0.6943 Epoch 00021: saving model to ./training/snapshots/resnet101_pascal_21.h5 Epoch 22/150 1/500 [..............................] - ETA: 2:39 - loss: 0.2233 - regression_loss: 0.2051 - classification_loss: 0.0182 2/500 [..............................] - ETA: 2:49 - loss: 0.3622 - regression_loss: 0.3300 - classification_loss: 0.0322 3/500 [..............................] - ETA: 2:50 - loss: 0.3876 - regression_loss: 0.3535 - classification_loss: 0.0341 4/500 [..............................] - ETA: 2:52 - loss: 0.4476 - regression_loss: 0.4096 - classification_loss: 0.0380 5/500 [..............................] - ETA: 2:52 - loss: 0.4706 - regression_loss: 0.4312 - classification_loss: 0.0394 6/500 [..............................] - ETA: 2:52 - loss: 0.4462 - regression_loss: 0.4091 - classification_loss: 0.0371 7/500 [..............................] - ETA: 2:52 - loss: 0.4412 - regression_loss: 0.4035 - classification_loss: 0.0377 8/500 [..............................] - ETA: 2:51 - loss: 0.4743 - regression_loss: 0.4345 - classification_loss: 0.0398 9/500 [..............................] - ETA: 2:50 - loss: 0.4830 - regression_loss: 0.4419 - classification_loss: 0.0411 10/500 [..............................] - ETA: 2:49 - loss: 0.5101 - regression_loss: 0.4699 - classification_loss: 0.0403 11/500 [..............................] - ETA: 2:48 - loss: 0.5115 - regression_loss: 0.4712 - classification_loss: 0.0404 12/500 [..............................] - ETA: 2:49 - loss: 0.4992 - regression_loss: 0.4608 - classification_loss: 0.0385 13/500 [..............................] - ETA: 2:48 - loss: 0.5030 - regression_loss: 0.4633 - classification_loss: 0.0397 14/500 [..............................] 
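As the progress bars show, the reported loss in keras-retinanet is the sum of the box-regression term and the classification (focal loss) term. A minimal sketch checking that against the epoch-21 summary above (the helper name is ours; values are taken from the log, which rounds to four decimals):

```python
# Sketch: recombine the two sub-losses the way the progress bar reports them
# (total loss = regression_loss + classification_loss in keras-retinanet).

def total_loss(regression_loss: float, classification_loss: float) -> float:
    """Sum of the two loss components shown in the training log."""
    return regression_loss + classification_loss

# Final step of epoch 21 from the log above:
logged = {"loss": 0.5214, "regression_loss": 0.4770, "classification_loss": 0.0443}

recombined = total_loss(logged["regression_loss"], logged["classification_loss"])

# Agreement only up to the progress bar's 4-decimal rounding.
assert abs(recombined - logged["loss"]) < 1e-3
print(f"recombined loss: {recombined:.4f}")
```

The small residual against the logged 0.5214 is just display rounding; the two components account for the whole loss, with classification contributing under a tenth of the total at this point in training.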
[progress-bar output trimmed: steps 14 to 77 of epoch 22; loss hovered between roughly 0.48 and 0.53]
- ETA: 2:23 - loss: 0.5382 - regression_loss: 0.4929 - classification_loss: 0.0453 79/500 [===>..........................] - ETA: 2:23 - loss: 0.5383 - regression_loss: 0.4928 - classification_loss: 0.0455 80/500 [===>..........................] - ETA: 2:22 - loss: 0.5384 - regression_loss: 0.4931 - classification_loss: 0.0453 81/500 [===>..........................] - ETA: 2:22 - loss: 0.5354 - regression_loss: 0.4904 - classification_loss: 0.0451 82/500 [===>..........................] - ETA: 2:22 - loss: 0.5366 - regression_loss: 0.4915 - classification_loss: 0.0451 83/500 [===>..........................] - ETA: 2:21 - loss: 0.5389 - regression_loss: 0.4938 - classification_loss: 0.0451 84/500 [====>.........................] - ETA: 2:21 - loss: 0.5393 - regression_loss: 0.4940 - classification_loss: 0.0453 85/500 [====>.........................] - ETA: 2:21 - loss: 0.5437 - regression_loss: 0.4981 - classification_loss: 0.0456 86/500 [====>.........................] - ETA: 2:20 - loss: 0.5395 - regression_loss: 0.4943 - classification_loss: 0.0452 87/500 [====>.........................] - ETA: 2:20 - loss: 0.5434 - regression_loss: 0.4981 - classification_loss: 0.0453 88/500 [====>.........................] - ETA: 2:20 - loss: 0.5432 - regression_loss: 0.4981 - classification_loss: 0.0451 89/500 [====>.........................] - ETA: 2:19 - loss: 0.5424 - regression_loss: 0.4974 - classification_loss: 0.0450 90/500 [====>.........................] - ETA: 2:19 - loss: 0.5455 - regression_loss: 0.5004 - classification_loss: 0.0451 91/500 [====>.........................] - ETA: 2:19 - loss: 0.5464 - regression_loss: 0.5012 - classification_loss: 0.0453 92/500 [====>.........................] - ETA: 2:18 - loss: 0.5457 - regression_loss: 0.5005 - classification_loss: 0.0451 93/500 [====>.........................] - ETA: 2:18 - loss: 0.5447 - regression_loss: 0.4996 - classification_loss: 0.0451 94/500 [====>.........................] 
- ETA: 2:17 - loss: 0.5464 - regression_loss: 0.5013 - classification_loss: 0.0451 95/500 [====>.........................] - ETA: 2:17 - loss: 0.5484 - regression_loss: 0.5029 - classification_loss: 0.0455 96/500 [====>.........................] - ETA: 2:17 - loss: 0.5495 - regression_loss: 0.5041 - classification_loss: 0.0454 97/500 [====>.........................] - ETA: 2:16 - loss: 0.5480 - regression_loss: 0.5028 - classification_loss: 0.0452 98/500 [====>.........................] - ETA: 2:16 - loss: 0.5486 - regression_loss: 0.5033 - classification_loss: 0.0453 99/500 [====>.........................] - ETA: 2:16 - loss: 0.5502 - regression_loss: 0.5046 - classification_loss: 0.0456 100/500 [=====>........................] - ETA: 2:15 - loss: 0.5483 - regression_loss: 0.5027 - classification_loss: 0.0456 101/500 [=====>........................] - ETA: 2:15 - loss: 0.5511 - regression_loss: 0.5056 - classification_loss: 0.0454 102/500 [=====>........................] - ETA: 2:14 - loss: 0.5487 - regression_loss: 0.5034 - classification_loss: 0.0453 103/500 [=====>........................] - ETA: 2:14 - loss: 0.5475 - regression_loss: 0.5024 - classification_loss: 0.0452 104/500 [=====>........................] - ETA: 2:14 - loss: 0.5456 - regression_loss: 0.5007 - classification_loss: 0.0450 105/500 [=====>........................] - ETA: 2:13 - loss: 0.5475 - regression_loss: 0.5025 - classification_loss: 0.0450 106/500 [=====>........................] - ETA: 2:13 - loss: 0.5448 - regression_loss: 0.5000 - classification_loss: 0.0448 107/500 [=====>........................] - ETA: 2:13 - loss: 0.5430 - regression_loss: 0.4985 - classification_loss: 0.0445 108/500 [=====>........................] - ETA: 2:13 - loss: 0.5442 - regression_loss: 0.4998 - classification_loss: 0.0445 109/500 [=====>........................] - ETA: 2:12 - loss: 0.5435 - regression_loss: 0.4991 - classification_loss: 0.0444 110/500 [=====>........................] 
- ETA: 2:12 - loss: 0.5405 - regression_loss: 0.4964 - classification_loss: 0.0441 111/500 [=====>........................] - ETA: 2:12 - loss: 0.5403 - regression_loss: 0.4963 - classification_loss: 0.0441 112/500 [=====>........................] - ETA: 2:11 - loss: 0.5409 - regression_loss: 0.4965 - classification_loss: 0.0444 113/500 [=====>........................] - ETA: 2:11 - loss: 0.5391 - regression_loss: 0.4949 - classification_loss: 0.0443 114/500 [=====>........................] - ETA: 2:11 - loss: 0.5412 - regression_loss: 0.4965 - classification_loss: 0.0446 115/500 [=====>........................] - ETA: 2:10 - loss: 0.5411 - regression_loss: 0.4964 - classification_loss: 0.0447 116/500 [=====>........................] - ETA: 2:10 - loss: 0.5403 - regression_loss: 0.4957 - classification_loss: 0.0447 117/500 [======>.......................] - ETA: 2:10 - loss: 0.5396 - regression_loss: 0.4949 - classification_loss: 0.0447 118/500 [======>.......................] - ETA: 2:09 - loss: 0.5370 - regression_loss: 0.4925 - classification_loss: 0.0445 119/500 [======>.......................] - ETA: 2:09 - loss: 0.5384 - regression_loss: 0.4940 - classification_loss: 0.0444 120/500 [======>.......................] - ETA: 2:09 - loss: 0.5381 - regression_loss: 0.4937 - classification_loss: 0.0444 121/500 [======>.......................] - ETA: 2:08 - loss: 0.5360 - regression_loss: 0.4918 - classification_loss: 0.0442 122/500 [======>.......................] - ETA: 2:08 - loss: 0.5347 - regression_loss: 0.4906 - classification_loss: 0.0441 123/500 [======>.......................] - ETA: 2:08 - loss: 0.5349 - regression_loss: 0.4908 - classification_loss: 0.0441 124/500 [======>.......................] - ETA: 2:07 - loss: 0.5329 - regression_loss: 0.4890 - classification_loss: 0.0439 125/500 [======>.......................] - ETA: 2:07 - loss: 0.5343 - regression_loss: 0.4899 - classification_loss: 0.0444 126/500 [======>.......................] 
- ETA: 2:07 - loss: 0.5321 - regression_loss: 0.4879 - classification_loss: 0.0442 127/500 [======>.......................] - ETA: 2:06 - loss: 0.5327 - regression_loss: 0.4884 - classification_loss: 0.0443 128/500 [======>.......................] - ETA: 2:06 - loss: 0.5309 - regression_loss: 0.4868 - classification_loss: 0.0441 129/500 [======>.......................] - ETA: 2:06 - loss: 0.5295 - regression_loss: 0.4855 - classification_loss: 0.0440 130/500 [======>.......................] - ETA: 2:05 - loss: 0.5295 - regression_loss: 0.4856 - classification_loss: 0.0440 131/500 [======>.......................] - ETA: 2:05 - loss: 0.5285 - regression_loss: 0.4847 - classification_loss: 0.0438 132/500 [======>.......................] - ETA: 2:05 - loss: 0.5279 - regression_loss: 0.4841 - classification_loss: 0.0438 133/500 [======>.......................] - ETA: 2:04 - loss: 0.5264 - regression_loss: 0.4829 - classification_loss: 0.0435 134/500 [=======>......................] - ETA: 2:04 - loss: 0.5257 - regression_loss: 0.4821 - classification_loss: 0.0436 135/500 [=======>......................] - ETA: 2:04 - loss: 0.5256 - regression_loss: 0.4822 - classification_loss: 0.0434 136/500 [=======>......................] - ETA: 2:03 - loss: 0.5283 - regression_loss: 0.4843 - classification_loss: 0.0440 137/500 [=======>......................] - ETA: 2:03 - loss: 0.5271 - regression_loss: 0.4832 - classification_loss: 0.0438 138/500 [=======>......................] - ETA: 2:02 - loss: 0.5259 - regression_loss: 0.4822 - classification_loss: 0.0437 139/500 [=======>......................] - ETA: 2:02 - loss: 0.5244 - regression_loss: 0.4809 - classification_loss: 0.0435 140/500 [=======>......................] - ETA: 2:02 - loss: 0.5266 - regression_loss: 0.4830 - classification_loss: 0.0436 141/500 [=======>......................] - ETA: 2:01 - loss: 0.5253 - regression_loss: 0.4818 - classification_loss: 0.0434 142/500 [=======>......................] 
- ETA: 2:01 - loss: 0.5251 - regression_loss: 0.4817 - classification_loss: 0.0434 143/500 [=======>......................] - ETA: 2:01 - loss: 0.5247 - regression_loss: 0.4813 - classification_loss: 0.0434 144/500 [=======>......................] - ETA: 2:00 - loss: 0.5249 - regression_loss: 0.4814 - classification_loss: 0.0435 145/500 [=======>......................] - ETA: 2:00 - loss: 0.5254 - regression_loss: 0.4819 - classification_loss: 0.0435 146/500 [=======>......................] - ETA: 2:00 - loss: 0.5250 - regression_loss: 0.4817 - classification_loss: 0.0433 147/500 [=======>......................] - ETA: 1:59 - loss: 0.5239 - regression_loss: 0.4806 - classification_loss: 0.0433 148/500 [=======>......................] - ETA: 1:59 - loss: 0.5246 - regression_loss: 0.4811 - classification_loss: 0.0435 149/500 [=======>......................] - ETA: 1:59 - loss: 0.5247 - regression_loss: 0.4813 - classification_loss: 0.0434 150/500 [========>.....................] - ETA: 1:58 - loss: 0.5231 - regression_loss: 0.4797 - classification_loss: 0.0434 151/500 [========>.....................] - ETA: 1:58 - loss: 0.5221 - regression_loss: 0.4789 - classification_loss: 0.0433 152/500 [========>.....................] - ETA: 1:58 - loss: 0.5223 - regression_loss: 0.4790 - classification_loss: 0.0433 153/500 [========>.....................] - ETA: 1:57 - loss: 0.5208 - regression_loss: 0.4776 - classification_loss: 0.0432 154/500 [========>.....................] - ETA: 1:57 - loss: 0.5207 - regression_loss: 0.4774 - classification_loss: 0.0432 155/500 [========>.....................] - ETA: 1:57 - loss: 0.5199 - regression_loss: 0.4768 - classification_loss: 0.0431 156/500 [========>.....................] - ETA: 1:56 - loss: 0.5225 - regression_loss: 0.4794 - classification_loss: 0.0430 157/500 [========>.....................] - ETA: 1:56 - loss: 0.5237 - regression_loss: 0.4806 - classification_loss: 0.0432 158/500 [========>.....................] 
- ETA: 1:56 - loss: 0.5242 - regression_loss: 0.4810 - classification_loss: 0.0432 159/500 [========>.....................] - ETA: 1:55 - loss: 0.5236 - regression_loss: 0.4806 - classification_loss: 0.0430 160/500 [========>.....................] - ETA: 1:55 - loss: 0.5224 - regression_loss: 0.4795 - classification_loss: 0.0429 161/500 [========>.....................] - ETA: 1:55 - loss: 0.5222 - regression_loss: 0.4792 - classification_loss: 0.0430 162/500 [========>.....................] - ETA: 1:55 - loss: 0.5208 - regression_loss: 0.4779 - classification_loss: 0.0429 163/500 [========>.....................] - ETA: 1:54 - loss: 0.5201 - regression_loss: 0.4773 - classification_loss: 0.0428 164/500 [========>.....................] - ETA: 1:54 - loss: 0.5213 - regression_loss: 0.4783 - classification_loss: 0.0430 165/500 [========>.....................] - ETA: 1:53 - loss: 0.5227 - regression_loss: 0.4794 - classification_loss: 0.0433 166/500 [========>.....................] - ETA: 1:53 - loss: 0.5238 - regression_loss: 0.4804 - classification_loss: 0.0434 167/500 [=========>....................] - ETA: 1:53 - loss: 0.5225 - regression_loss: 0.4791 - classification_loss: 0.0434 168/500 [=========>....................] - ETA: 1:52 - loss: 0.5226 - regression_loss: 0.4791 - classification_loss: 0.0435 169/500 [=========>....................] - ETA: 1:52 - loss: 0.5213 - regression_loss: 0.4779 - classification_loss: 0.0434 170/500 [=========>....................] - ETA: 1:52 - loss: 0.5218 - regression_loss: 0.4784 - classification_loss: 0.0434 171/500 [=========>....................] - ETA: 1:51 - loss: 0.5219 - regression_loss: 0.4785 - classification_loss: 0.0434 172/500 [=========>....................] - ETA: 1:51 - loss: 0.5230 - regression_loss: 0.4795 - classification_loss: 0.0435 173/500 [=========>....................] - ETA: 1:51 - loss: 0.5230 - regression_loss: 0.4794 - classification_loss: 0.0436 174/500 [=========>....................] 
- ETA: 1:50 - loss: 0.5238 - regression_loss: 0.4803 - classification_loss: 0.0436 175/500 [=========>....................] - ETA: 1:50 - loss: 0.5254 - regression_loss: 0.4818 - classification_loss: 0.0436 176/500 [=========>....................] - ETA: 1:50 - loss: 0.5262 - regression_loss: 0.4825 - classification_loss: 0.0437 177/500 [=========>....................] - ETA: 1:49 - loss: 0.5251 - regression_loss: 0.4815 - classification_loss: 0.0436 178/500 [=========>....................] - ETA: 1:49 - loss: 0.5234 - regression_loss: 0.4800 - classification_loss: 0.0434 179/500 [=========>....................] - ETA: 1:49 - loss: 0.5265 - regression_loss: 0.4828 - classification_loss: 0.0437 180/500 [=========>....................] - ETA: 1:48 - loss: 0.5254 - regression_loss: 0.4818 - classification_loss: 0.0435 181/500 [=========>....................] - ETA: 1:48 - loss: 0.5257 - regression_loss: 0.4822 - classification_loss: 0.0435 182/500 [=========>....................] - ETA: 1:48 - loss: 0.5251 - regression_loss: 0.4817 - classification_loss: 0.0434 183/500 [=========>....................] - ETA: 1:47 - loss: 0.5292 - regression_loss: 0.4855 - classification_loss: 0.0438 184/500 [==========>...................] - ETA: 1:47 - loss: 0.5276 - regression_loss: 0.4839 - classification_loss: 0.0437 185/500 [==========>...................] - ETA: 1:47 - loss: 0.5280 - regression_loss: 0.4844 - classification_loss: 0.0436 186/500 [==========>...................] - ETA: 1:46 - loss: 0.5277 - regression_loss: 0.4841 - classification_loss: 0.0436 187/500 [==========>...................] - ETA: 1:46 - loss: 0.5306 - regression_loss: 0.4869 - classification_loss: 0.0438 188/500 [==========>...................] - ETA: 1:46 - loss: 0.5301 - regression_loss: 0.4865 - classification_loss: 0.0437 189/500 [==========>...................] - ETA: 1:45 - loss: 0.5322 - regression_loss: 0.4883 - classification_loss: 0.0439 190/500 [==========>...................] 
- ETA: 1:45 - loss: 0.5323 - regression_loss: 0.4885 - classification_loss: 0.0438 191/500 [==========>...................] - ETA: 1:45 - loss: 0.5330 - regression_loss: 0.4891 - classification_loss: 0.0439 192/500 [==========>...................] - ETA: 1:44 - loss: 0.5340 - regression_loss: 0.4899 - classification_loss: 0.0441 193/500 [==========>...................] - ETA: 1:44 - loss: 0.5329 - regression_loss: 0.4888 - classification_loss: 0.0441 194/500 [==========>...................] - ETA: 1:44 - loss: 0.5319 - regression_loss: 0.4879 - classification_loss: 0.0440 195/500 [==========>...................] - ETA: 1:43 - loss: 0.5305 - regression_loss: 0.4867 - classification_loss: 0.0438 196/500 [==========>...................] - ETA: 1:43 - loss: 0.5285 - regression_loss: 0.4849 - classification_loss: 0.0436 197/500 [==========>...................] - ETA: 1:43 - loss: 0.5281 - regression_loss: 0.4843 - classification_loss: 0.0437 198/500 [==========>...................] - ETA: 1:42 - loss: 0.5288 - regression_loss: 0.4849 - classification_loss: 0.0439 199/500 [==========>...................] - ETA: 1:42 - loss: 0.5299 - regression_loss: 0.4858 - classification_loss: 0.0441 200/500 [===========>..................] - ETA: 1:42 - loss: 0.5313 - regression_loss: 0.4872 - classification_loss: 0.0441 201/500 [===========>..................] - ETA: 1:41 - loss: 0.5299 - regression_loss: 0.4858 - classification_loss: 0.0440 202/500 [===========>..................] - ETA: 1:41 - loss: 0.5302 - regression_loss: 0.4861 - classification_loss: 0.0441 203/500 [===========>..................] - ETA: 1:41 - loss: 0.5287 - regression_loss: 0.4847 - classification_loss: 0.0440 204/500 [===========>..................] - ETA: 1:40 - loss: 0.5276 - regression_loss: 0.4837 - classification_loss: 0.0439 205/500 [===========>..................] - ETA: 1:40 - loss: 0.5265 - regression_loss: 0.4827 - classification_loss: 0.0437 206/500 [===========>..................] 
- ETA: 1:40 - loss: 0.5280 - regression_loss: 0.4840 - classification_loss: 0.0440 207/500 [===========>..................] - ETA: 1:39 - loss: 0.5274 - regression_loss: 0.4835 - classification_loss: 0.0438 208/500 [===========>..................] - ETA: 1:39 - loss: 0.5265 - regression_loss: 0.4828 - classification_loss: 0.0437 209/500 [===========>..................] - ETA: 1:39 - loss: 0.5258 - regression_loss: 0.4822 - classification_loss: 0.0436 210/500 [===========>..................] - ETA: 1:38 - loss: 0.5283 - regression_loss: 0.4843 - classification_loss: 0.0441 211/500 [===========>..................] - ETA: 1:38 - loss: 0.5274 - regression_loss: 0.4834 - classification_loss: 0.0439 212/500 [===========>..................] - ETA: 1:38 - loss: 0.5288 - regression_loss: 0.4847 - classification_loss: 0.0440 213/500 [===========>..................] - ETA: 1:37 - loss: 0.5287 - regression_loss: 0.4847 - classification_loss: 0.0440 214/500 [===========>..................] - ETA: 1:37 - loss: 0.5286 - regression_loss: 0.4846 - classification_loss: 0.0439 215/500 [===========>..................] - ETA: 1:37 - loss: 0.5294 - regression_loss: 0.4853 - classification_loss: 0.0442 216/500 [===========>..................] - ETA: 1:36 - loss: 0.5308 - regression_loss: 0.4864 - classification_loss: 0.0444 217/500 [============>.................] - ETA: 1:36 - loss: 0.5300 - regression_loss: 0.4857 - classification_loss: 0.0443 218/500 [============>.................] - ETA: 1:36 - loss: 0.5294 - regression_loss: 0.4851 - classification_loss: 0.0443 219/500 [============>.................] - ETA: 1:35 - loss: 0.5286 - regression_loss: 0.4844 - classification_loss: 0.0443 220/500 [============>.................] - ETA: 1:35 - loss: 0.5279 - regression_loss: 0.4836 - classification_loss: 0.0442 221/500 [============>.................] - ETA: 1:35 - loss: 0.5271 - regression_loss: 0.4830 - classification_loss: 0.0442 222/500 [============>.................] 
- ETA: 1:34 - loss: 0.5260 - regression_loss: 0.4819 - classification_loss: 0.0441 223/500 [============>.................] - ETA: 1:34 - loss: 0.5257 - regression_loss: 0.4816 - classification_loss: 0.0441 224/500 [============>.................] - ETA: 1:34 - loss: 0.5251 - regression_loss: 0.4810 - classification_loss: 0.0441 225/500 [============>.................] - ETA: 1:33 - loss: 0.5239 - regression_loss: 0.4800 - classification_loss: 0.0440 226/500 [============>.................] - ETA: 1:33 - loss: 0.5240 - regression_loss: 0.4800 - classification_loss: 0.0440 227/500 [============>.................] - ETA: 1:32 - loss: 0.5238 - regression_loss: 0.4798 - classification_loss: 0.0440 228/500 [============>.................] - ETA: 1:32 - loss: 0.5246 - regression_loss: 0.4803 - classification_loss: 0.0442 229/500 [============>.................] - ETA: 1:32 - loss: 0.5242 - regression_loss: 0.4800 - classification_loss: 0.0442 230/500 [============>.................] - ETA: 1:32 - loss: 0.5230 - regression_loss: 0.4788 - classification_loss: 0.0442 231/500 [============>.................] - ETA: 1:31 - loss: 0.5228 - regression_loss: 0.4786 - classification_loss: 0.0443 232/500 [============>.................] - ETA: 1:31 - loss: 0.5219 - regression_loss: 0.4777 - classification_loss: 0.0442 233/500 [============>.................] - ETA: 1:30 - loss: 0.5225 - regression_loss: 0.4783 - classification_loss: 0.0442 234/500 [=============>................] - ETA: 1:30 - loss: 0.5220 - regression_loss: 0.4778 - classification_loss: 0.0442 235/500 [=============>................] - ETA: 1:30 - loss: 0.5222 - regression_loss: 0.4779 - classification_loss: 0.0443 236/500 [=============>................] - ETA: 1:29 - loss: 0.5220 - regression_loss: 0.4778 - classification_loss: 0.0442 237/500 [=============>................] - ETA: 1:29 - loss: 0.5232 - regression_loss: 0.4789 - classification_loss: 0.0443 238/500 [=============>................] 
- ETA: 1:29 - loss: 0.5225 - regression_loss: 0.4782 - classification_loss: 0.0443 239/500 [=============>................] - ETA: 1:28 - loss: 0.5218 - regression_loss: 0.4776 - classification_loss: 0.0442 240/500 [=============>................] - ETA: 1:28 - loss: 0.5214 - regression_loss: 0.4773 - classification_loss: 0.0441 241/500 [=============>................] - ETA: 1:28 - loss: 0.5218 - regression_loss: 0.4777 - classification_loss: 0.0441 242/500 [=============>................] - ETA: 1:27 - loss: 0.5235 - regression_loss: 0.4791 - classification_loss: 0.0444 243/500 [=============>................] - ETA: 1:27 - loss: 0.5222 - regression_loss: 0.4779 - classification_loss: 0.0443 244/500 [=============>................] - ETA: 1:27 - loss: 0.5219 - regression_loss: 0.4777 - classification_loss: 0.0442 245/500 [=============>................] - ETA: 1:26 - loss: 0.5208 - regression_loss: 0.4767 - classification_loss: 0.0441 246/500 [=============>................] - ETA: 1:26 - loss: 0.5210 - regression_loss: 0.4769 - classification_loss: 0.0442 247/500 [=============>................] - ETA: 1:26 - loss: 0.5210 - regression_loss: 0.4769 - classification_loss: 0.0441 248/500 [=============>................] - ETA: 1:25 - loss: 0.5202 - regression_loss: 0.4761 - classification_loss: 0.0440 249/500 [=============>................] - ETA: 1:25 - loss: 0.5201 - regression_loss: 0.4761 - classification_loss: 0.0440 250/500 [==============>...............] - ETA: 1:25 - loss: 0.5203 - regression_loss: 0.4763 - classification_loss: 0.0439 251/500 [==============>...............] - ETA: 1:24 - loss: 0.5192 - regression_loss: 0.4753 - classification_loss: 0.0439 252/500 [==============>...............] - ETA: 1:24 - loss: 0.5189 - regression_loss: 0.4752 - classification_loss: 0.0437 253/500 [==============>...............] - ETA: 1:24 - loss: 0.5183 - regression_loss: 0.4746 - classification_loss: 0.0436 254/500 [==============>...............] 
- ETA: 1:23 - loss: 0.5190 - regression_loss: 0.4753 - classification_loss: 0.0436 255/500 [==============>...............] - ETA: 1:23 - loss: 0.5194 - regression_loss: 0.4758 - classification_loss: 0.0436 256/500 [==============>...............] - ETA: 1:23 - loss: 0.5198 - regression_loss: 0.4761 - classification_loss: 0.0438 257/500 [==============>...............] - ETA: 1:22 - loss: 0.5203 - regression_loss: 0.4765 - classification_loss: 0.0438 258/500 [==============>...............] - ETA: 1:22 - loss: 0.5212 - regression_loss: 0.4773 - classification_loss: 0.0440 259/500 [==============>...............] - ETA: 1:22 - loss: 0.5214 - regression_loss: 0.4775 - classification_loss: 0.0439 260/500 [==============>...............] - ETA: 1:21 - loss: 0.5222 - regression_loss: 0.4781 - classification_loss: 0.0440 261/500 [==============>...............] - ETA: 1:21 - loss: 0.5224 - regression_loss: 0.4784 - classification_loss: 0.0440 262/500 [==============>...............] - ETA: 1:21 - loss: 0.5214 - regression_loss: 0.4775 - classification_loss: 0.0439 263/500 [==============>...............] - ETA: 1:20 - loss: 0.5208 - regression_loss: 0.4769 - classification_loss: 0.0439 264/500 [==============>...............] - ETA: 1:20 - loss: 0.5206 - regression_loss: 0.4767 - classification_loss: 0.0438 265/500 [==============>...............] - ETA: 1:20 - loss: 0.5209 - regression_loss: 0.4771 - classification_loss: 0.0438 266/500 [==============>...............] - ETA: 1:19 - loss: 0.5209 - regression_loss: 0.4771 - classification_loss: 0.0438 267/500 [===============>..............] - ETA: 1:19 - loss: 0.5214 - regression_loss: 0.4775 - classification_loss: 0.0438 268/500 [===============>..............] - ETA: 1:19 - loss: 0.5209 - regression_loss: 0.4771 - classification_loss: 0.0438 269/500 [===============>..............] - ETA: 1:18 - loss: 0.5202 - regression_loss: 0.4765 - classification_loss: 0.0437 270/500 [===============>..............] 
- ETA: 1:18 - loss: 0.5197 - regression_loss: 0.4760 - classification_loss: 0.0436 271/500 [===============>..............] - ETA: 1:18 - loss: 0.5191 - regression_loss: 0.4755 - classification_loss: 0.0436 272/500 [===============>..............] - ETA: 1:17 - loss: 0.5194 - regression_loss: 0.4757 - classification_loss: 0.0436 273/500 [===============>..............] - ETA: 1:17 - loss: 0.5190 - regression_loss: 0.4754 - classification_loss: 0.0436 274/500 [===============>..............] - ETA: 1:17 - loss: 0.5182 - regression_loss: 0.4746 - classification_loss: 0.0435 275/500 [===============>..............] - ETA: 1:16 - loss: 0.5176 - regression_loss: 0.4740 - classification_loss: 0.0435 276/500 [===============>..............] - ETA: 1:16 - loss: 0.5180 - regression_loss: 0.4744 - classification_loss: 0.0436 277/500 [===============>..............] - ETA: 1:16 - loss: 0.5172 - regression_loss: 0.4737 - classification_loss: 0.0436 278/500 [===============>..............] - ETA: 1:15 - loss: 0.5172 - regression_loss: 0.4737 - classification_loss: 0.0436 279/500 [===============>..............] - ETA: 1:15 - loss: 0.5168 - regression_loss: 0.4732 - classification_loss: 0.0435 280/500 [===============>..............] - ETA: 1:15 - loss: 0.5162 - regression_loss: 0.4726 - classification_loss: 0.0436 281/500 [===============>..............] - ETA: 1:14 - loss: 0.5153 - regression_loss: 0.4718 - classification_loss: 0.0435 282/500 [===============>..............] - ETA: 1:14 - loss: 0.5151 - regression_loss: 0.4716 - classification_loss: 0.0435 283/500 [===============>..............] - ETA: 1:13 - loss: 0.5151 - regression_loss: 0.4716 - classification_loss: 0.0436 284/500 [================>.............] - ETA: 1:13 - loss: 0.5157 - regression_loss: 0.4720 - classification_loss: 0.0436 285/500 [================>.............] - ETA: 1:13 - loss: 0.5145 - regression_loss: 0.4710 - classification_loss: 0.0435 286/500 [================>.............] 
- ETA: 1:12 - loss: 0.5140 - regression_loss: 0.4705 - classification_loss: 0.0435 287/500 [================>.............]
[... per-batch progress for steps 288-499 of Epoch 22 trimmed ...]
500/500 [==============================] - 171s 342ms/step - loss: 0.4952 - regression_loss: 0.4540 - classification_loss: 0.0412
1172 instances of class plum with average precision: 0.7353
mAP: 0.7353
Epoch 00022: saving model to ./training/snapshots/resnet101_pascal_22.h5
Epoch 23/150
[... per-batch progress for steps 1-120 of Epoch 23 trimmed ...]
121/500 [======>.......................]
- ETA: 2:09 - loss: 0.4571 - regression_loss: 0.4197 - classification_loss: 0.0375 122/500 [======>.......................] - ETA: 2:09 - loss: 0.4569 - regression_loss: 0.4194 - classification_loss: 0.0374 123/500 [======>.......................] - ETA: 2:09 - loss: 0.4594 - regression_loss: 0.4221 - classification_loss: 0.0373 124/500 [======>.......................] - ETA: 2:08 - loss: 0.4588 - regression_loss: 0.4217 - classification_loss: 0.0371 125/500 [======>.......................] - ETA: 2:08 - loss: 0.4575 - regression_loss: 0.4205 - classification_loss: 0.0370 126/500 [======>.......................] - ETA: 2:08 - loss: 0.4581 - regression_loss: 0.4210 - classification_loss: 0.0371 127/500 [======>.......................] - ETA: 2:07 - loss: 0.4564 - regression_loss: 0.4195 - classification_loss: 0.0369 128/500 [======>.......................] - ETA: 2:07 - loss: 0.4574 - regression_loss: 0.4202 - classification_loss: 0.0371 129/500 [======>.......................] - ETA: 2:06 - loss: 0.4565 - regression_loss: 0.4195 - classification_loss: 0.0370 130/500 [======>.......................] - ETA: 2:06 - loss: 0.4557 - regression_loss: 0.4188 - classification_loss: 0.0369 131/500 [======>.......................] - ETA: 2:06 - loss: 0.4539 - regression_loss: 0.4172 - classification_loss: 0.0367 132/500 [======>.......................] - ETA: 2:05 - loss: 0.4524 - regression_loss: 0.4158 - classification_loss: 0.0366 133/500 [======>.......................] - ETA: 2:05 - loss: 0.4516 - regression_loss: 0.4151 - classification_loss: 0.0364 134/500 [=======>......................] - ETA: 2:05 - loss: 0.4518 - regression_loss: 0.4154 - classification_loss: 0.0364 135/500 [=======>......................] - ETA: 2:05 - loss: 0.4526 - regression_loss: 0.4161 - classification_loss: 0.0365 136/500 [=======>......................] - ETA: 2:04 - loss: 0.4548 - regression_loss: 0.4181 - classification_loss: 0.0367 137/500 [=======>......................] 
- ETA: 2:04 - loss: 0.4544 - regression_loss: 0.4177 - classification_loss: 0.0367 138/500 [=======>......................] - ETA: 2:03 - loss: 0.4549 - regression_loss: 0.4181 - classification_loss: 0.0368 139/500 [=======>......................] - ETA: 2:03 - loss: 0.4545 - regression_loss: 0.4177 - classification_loss: 0.0368 140/500 [=======>......................] - ETA: 2:03 - loss: 0.4531 - regression_loss: 0.4164 - classification_loss: 0.0367 141/500 [=======>......................] - ETA: 2:03 - loss: 0.4533 - regression_loss: 0.4165 - classification_loss: 0.0368 142/500 [=======>......................] - ETA: 2:02 - loss: 0.4538 - regression_loss: 0.4170 - classification_loss: 0.0368 143/500 [=======>......................] - ETA: 2:02 - loss: 0.4521 - regression_loss: 0.4155 - classification_loss: 0.0366 144/500 [=======>......................] - ETA: 2:02 - loss: 0.4510 - regression_loss: 0.4144 - classification_loss: 0.0366 145/500 [=======>......................] - ETA: 2:01 - loss: 0.4499 - regression_loss: 0.4134 - classification_loss: 0.0365 146/500 [=======>......................] - ETA: 2:01 - loss: 0.4501 - regression_loss: 0.4135 - classification_loss: 0.0366 147/500 [=======>......................] - ETA: 2:00 - loss: 0.4511 - regression_loss: 0.4145 - classification_loss: 0.0366 148/500 [=======>......................] - ETA: 2:00 - loss: 0.4526 - regression_loss: 0.4156 - classification_loss: 0.0370 149/500 [=======>......................] - ETA: 2:00 - loss: 0.4527 - regression_loss: 0.4157 - classification_loss: 0.0370 150/500 [========>.....................] - ETA: 1:59 - loss: 0.4527 - regression_loss: 0.4158 - classification_loss: 0.0369 151/500 [========>.....................] - ETA: 1:59 - loss: 0.4542 - regression_loss: 0.4171 - classification_loss: 0.0371 152/500 [========>.....................] - ETA: 1:59 - loss: 0.4551 - regression_loss: 0.4178 - classification_loss: 0.0372 153/500 [========>.....................] 
- ETA: 1:58 - loss: 0.4554 - regression_loss: 0.4184 - classification_loss: 0.0371 154/500 [========>.....................] - ETA: 1:58 - loss: 0.4539 - regression_loss: 0.4170 - classification_loss: 0.0369 155/500 [========>.....................] - ETA: 1:58 - loss: 0.4542 - regression_loss: 0.4174 - classification_loss: 0.0369 156/500 [========>.....................] - ETA: 1:57 - loss: 0.4544 - regression_loss: 0.4176 - classification_loss: 0.0368 157/500 [========>.....................] - ETA: 1:57 - loss: 0.4556 - regression_loss: 0.4187 - classification_loss: 0.0369 158/500 [========>.....................] - ETA: 1:57 - loss: 0.4565 - regression_loss: 0.4196 - classification_loss: 0.0369 159/500 [========>.....................] - ETA: 1:56 - loss: 0.4574 - regression_loss: 0.4203 - classification_loss: 0.0371 160/500 [========>.....................] - ETA: 1:56 - loss: 0.4582 - regression_loss: 0.4210 - classification_loss: 0.0371 161/500 [========>.....................] - ETA: 1:56 - loss: 0.4591 - regression_loss: 0.4220 - classification_loss: 0.0372 162/500 [========>.....................] - ETA: 1:55 - loss: 0.4584 - regression_loss: 0.4214 - classification_loss: 0.0370 163/500 [========>.....................] - ETA: 1:55 - loss: 0.4575 - regression_loss: 0.4206 - classification_loss: 0.0370 164/500 [========>.....................] - ETA: 1:55 - loss: 0.4584 - regression_loss: 0.4213 - classification_loss: 0.0371 165/500 [========>.....................] - ETA: 1:54 - loss: 0.4570 - regression_loss: 0.4200 - classification_loss: 0.0370 166/500 [========>.....................] - ETA: 1:54 - loss: 0.4590 - regression_loss: 0.4217 - classification_loss: 0.0373 167/500 [=========>....................] - ETA: 1:53 - loss: 0.4596 - regression_loss: 0.4223 - classification_loss: 0.0373 168/500 [=========>....................] - ETA: 1:53 - loss: 0.4608 - regression_loss: 0.4233 - classification_loss: 0.0375 169/500 [=========>....................] 
- ETA: 1:53 - loss: 0.4605 - regression_loss: 0.4230 - classification_loss: 0.0375 170/500 [=========>....................] - ETA: 1:52 - loss: 0.4600 - regression_loss: 0.4226 - classification_loss: 0.0374 171/500 [=========>....................] - ETA: 1:52 - loss: 0.4601 - regression_loss: 0.4227 - classification_loss: 0.0374 172/500 [=========>....................] - ETA: 1:52 - loss: 0.4616 - regression_loss: 0.4241 - classification_loss: 0.0375 173/500 [=========>....................] - ETA: 1:51 - loss: 0.4607 - regression_loss: 0.4233 - classification_loss: 0.0374 174/500 [=========>....................] - ETA: 1:51 - loss: 0.4593 - regression_loss: 0.4220 - classification_loss: 0.0372 175/500 [=========>....................] - ETA: 1:51 - loss: 0.4584 - regression_loss: 0.4211 - classification_loss: 0.0372 176/500 [=========>....................] - ETA: 1:51 - loss: 0.4575 - regression_loss: 0.4203 - classification_loss: 0.0371 177/500 [=========>....................] - ETA: 1:50 - loss: 0.4567 - regression_loss: 0.4197 - classification_loss: 0.0371 178/500 [=========>....................] - ETA: 1:50 - loss: 0.4574 - regression_loss: 0.4204 - classification_loss: 0.0370 179/500 [=========>....................] - ETA: 1:49 - loss: 0.4570 - regression_loss: 0.4199 - classification_loss: 0.0371 180/500 [=========>....................] - ETA: 1:49 - loss: 0.4561 - regression_loss: 0.4192 - classification_loss: 0.0370 181/500 [=========>....................] - ETA: 1:49 - loss: 0.4561 - regression_loss: 0.4190 - classification_loss: 0.0371 182/500 [=========>....................] - ETA: 1:48 - loss: 0.4569 - regression_loss: 0.4195 - classification_loss: 0.0374 183/500 [=========>....................] - ETA: 1:48 - loss: 0.4563 - regression_loss: 0.4191 - classification_loss: 0.0372 184/500 [==========>...................] - ETA: 1:48 - loss: 0.4559 - regression_loss: 0.4186 - classification_loss: 0.0373 185/500 [==========>...................] 
- ETA: 1:47 - loss: 0.4577 - regression_loss: 0.4204 - classification_loss: 0.0373 186/500 [==========>...................] - ETA: 1:47 - loss: 0.4585 - regression_loss: 0.4212 - classification_loss: 0.0373 187/500 [==========>...................] - ETA: 1:47 - loss: 0.4578 - regression_loss: 0.4206 - classification_loss: 0.0372 188/500 [==========>...................] - ETA: 1:46 - loss: 0.4584 - regression_loss: 0.4211 - classification_loss: 0.0374 189/500 [==========>...................] - ETA: 1:46 - loss: 0.4589 - regression_loss: 0.4215 - classification_loss: 0.0374 190/500 [==========>...................] - ETA: 1:46 - loss: 0.4595 - regression_loss: 0.4221 - classification_loss: 0.0375 191/500 [==========>...................] - ETA: 1:45 - loss: 0.4596 - regression_loss: 0.4221 - classification_loss: 0.0374 192/500 [==========>...................] - ETA: 1:45 - loss: 0.4584 - regression_loss: 0.4211 - classification_loss: 0.0373 193/500 [==========>...................] - ETA: 1:45 - loss: 0.4579 - regression_loss: 0.4207 - classification_loss: 0.0372 194/500 [==========>...................] - ETA: 1:44 - loss: 0.4570 - regression_loss: 0.4199 - classification_loss: 0.0372 195/500 [==========>...................] - ETA: 1:44 - loss: 0.4562 - regression_loss: 0.4190 - classification_loss: 0.0371 196/500 [==========>...................] - ETA: 1:44 - loss: 0.4559 - regression_loss: 0.4186 - classification_loss: 0.0372 197/500 [==========>...................] - ETA: 1:43 - loss: 0.4557 - regression_loss: 0.4184 - classification_loss: 0.0373 198/500 [==========>...................] - ETA: 1:43 - loss: 0.4553 - regression_loss: 0.4180 - classification_loss: 0.0372 199/500 [==========>...................] - ETA: 1:43 - loss: 0.4543 - regression_loss: 0.4172 - classification_loss: 0.0371 200/500 [===========>..................] - ETA: 1:42 - loss: 0.4529 - regression_loss: 0.4159 - classification_loss: 0.0370 201/500 [===========>..................] 
- ETA: 1:42 - loss: 0.4529 - regression_loss: 0.4158 - classification_loss: 0.0371 202/500 [===========>..................] - ETA: 1:42 - loss: 0.4533 - regression_loss: 0.4161 - classification_loss: 0.0372 203/500 [===========>..................] - ETA: 1:41 - loss: 0.4533 - regression_loss: 0.4161 - classification_loss: 0.0372 204/500 [===========>..................] - ETA: 1:41 - loss: 0.4544 - regression_loss: 0.4171 - classification_loss: 0.0373 205/500 [===========>..................] - ETA: 1:41 - loss: 0.4540 - regression_loss: 0.4168 - classification_loss: 0.0372 206/500 [===========>..................] - ETA: 1:40 - loss: 0.4541 - regression_loss: 0.4169 - classification_loss: 0.0372 207/500 [===========>..................] - ETA: 1:40 - loss: 0.4547 - regression_loss: 0.4174 - classification_loss: 0.0373 208/500 [===========>..................] - ETA: 1:40 - loss: 0.4552 - regression_loss: 0.4179 - classification_loss: 0.0373 209/500 [===========>..................] - ETA: 1:39 - loss: 0.4557 - regression_loss: 0.4184 - classification_loss: 0.0373 210/500 [===========>..................] - ETA: 1:39 - loss: 0.4547 - regression_loss: 0.4174 - classification_loss: 0.0372 211/500 [===========>..................] - ETA: 1:39 - loss: 0.4549 - regression_loss: 0.4177 - classification_loss: 0.0372 212/500 [===========>..................] - ETA: 1:38 - loss: 0.4551 - regression_loss: 0.4180 - classification_loss: 0.0372 213/500 [===========>..................] - ETA: 1:38 - loss: 0.4555 - regression_loss: 0.4183 - classification_loss: 0.0372 214/500 [===========>..................] - ETA: 1:38 - loss: 0.4543 - regression_loss: 0.4173 - classification_loss: 0.0371 215/500 [===========>..................] - ETA: 1:37 - loss: 0.4537 - regression_loss: 0.4167 - classification_loss: 0.0370 216/500 [===========>..................] - ETA: 1:37 - loss: 0.4560 - regression_loss: 0.4188 - classification_loss: 0.0371 217/500 [============>.................] 
- ETA: 1:37 - loss: 0.4556 - regression_loss: 0.4186 - classification_loss: 0.0370 218/500 [============>.................] - ETA: 1:36 - loss: 0.4554 - regression_loss: 0.4185 - classification_loss: 0.0369 219/500 [============>.................] - ETA: 1:36 - loss: 0.4551 - regression_loss: 0.4183 - classification_loss: 0.0368 220/500 [============>.................] - ETA: 1:36 - loss: 0.4557 - regression_loss: 0.4189 - classification_loss: 0.0369 221/500 [============>.................] - ETA: 1:35 - loss: 0.4561 - regression_loss: 0.4191 - classification_loss: 0.0370 222/500 [============>.................] - ETA: 1:35 - loss: 0.4550 - regression_loss: 0.4181 - classification_loss: 0.0369 223/500 [============>.................] - ETA: 1:35 - loss: 0.4551 - regression_loss: 0.4182 - classification_loss: 0.0369 224/500 [============>.................] - ETA: 1:34 - loss: 0.4551 - regression_loss: 0.4183 - classification_loss: 0.0368 225/500 [============>.................] - ETA: 1:34 - loss: 0.4565 - regression_loss: 0.4195 - classification_loss: 0.0370 226/500 [============>.................] - ETA: 1:34 - loss: 0.4563 - regression_loss: 0.4193 - classification_loss: 0.0370 227/500 [============>.................] - ETA: 1:33 - loss: 0.4574 - regression_loss: 0.4202 - classification_loss: 0.0372 228/500 [============>.................] - ETA: 1:33 - loss: 0.4570 - regression_loss: 0.4199 - classification_loss: 0.0371 229/500 [============>.................] - ETA: 1:33 - loss: 0.4570 - regression_loss: 0.4199 - classification_loss: 0.0371 230/500 [============>.................] - ETA: 1:32 - loss: 0.4571 - regression_loss: 0.4200 - classification_loss: 0.0371 231/500 [============>.................] - ETA: 1:32 - loss: 0.4574 - regression_loss: 0.4203 - classification_loss: 0.0371 232/500 [============>.................] - ETA: 1:32 - loss: 0.4576 - regression_loss: 0.4205 - classification_loss: 0.0371 233/500 [============>.................] 
- ETA: 1:31 - loss: 0.4581 - regression_loss: 0.4209 - classification_loss: 0.0372 234/500 [=============>................] - ETA: 1:31 - loss: 0.4574 - regression_loss: 0.4203 - classification_loss: 0.0371 235/500 [=============>................] - ETA: 1:30 - loss: 0.4575 - regression_loss: 0.4203 - classification_loss: 0.0372 236/500 [=============>................] - ETA: 1:30 - loss: 0.4571 - regression_loss: 0.4200 - classification_loss: 0.0372 237/500 [=============>................] - ETA: 1:30 - loss: 0.4587 - regression_loss: 0.4213 - classification_loss: 0.0374 238/500 [=============>................] - ETA: 1:29 - loss: 0.4583 - regression_loss: 0.4209 - classification_loss: 0.0374 239/500 [=============>................] - ETA: 1:29 - loss: 0.4585 - regression_loss: 0.4210 - classification_loss: 0.0375 240/500 [=============>................] - ETA: 1:29 - loss: 0.4588 - regression_loss: 0.4213 - classification_loss: 0.0375 241/500 [=============>................] - ETA: 1:28 - loss: 0.4584 - regression_loss: 0.4209 - classification_loss: 0.0375 242/500 [=============>................] - ETA: 1:28 - loss: 0.4580 - regression_loss: 0.4206 - classification_loss: 0.0374 243/500 [=============>................] - ETA: 1:28 - loss: 0.4572 - regression_loss: 0.4199 - classification_loss: 0.0373 244/500 [=============>................] - ETA: 1:27 - loss: 0.4572 - regression_loss: 0.4197 - classification_loss: 0.0374 245/500 [=============>................] - ETA: 1:27 - loss: 0.4562 - regression_loss: 0.4188 - classification_loss: 0.0373 246/500 [=============>................] - ETA: 1:27 - loss: 0.4557 - regression_loss: 0.4184 - classification_loss: 0.0373 247/500 [=============>................] - ETA: 1:26 - loss: 0.4556 - regression_loss: 0.4184 - classification_loss: 0.0373 248/500 [=============>................] - ETA: 1:26 - loss: 0.4552 - regression_loss: 0.4180 - classification_loss: 0.0372 249/500 [=============>................] 
- ETA: 1:26 - loss: 0.4547 - regression_loss: 0.4175 - classification_loss: 0.0372 250/500 [==============>...............] - ETA: 1:25 - loss: 0.4551 - regression_loss: 0.4179 - classification_loss: 0.0372 251/500 [==============>...............] - ETA: 1:25 - loss: 0.4556 - regression_loss: 0.4184 - classification_loss: 0.0372 252/500 [==============>...............] - ETA: 1:25 - loss: 0.4551 - regression_loss: 0.4179 - classification_loss: 0.0372 253/500 [==============>...............] - ETA: 1:24 - loss: 0.4553 - regression_loss: 0.4181 - classification_loss: 0.0371 254/500 [==============>...............] - ETA: 1:24 - loss: 0.4544 - regression_loss: 0.4173 - classification_loss: 0.0371 255/500 [==============>...............] - ETA: 1:24 - loss: 0.4537 - regression_loss: 0.4167 - classification_loss: 0.0370 256/500 [==============>...............] - ETA: 1:23 - loss: 0.4539 - regression_loss: 0.4167 - classification_loss: 0.0372 257/500 [==============>...............] - ETA: 1:23 - loss: 0.4530 - regression_loss: 0.4160 - classification_loss: 0.0371 258/500 [==============>...............] - ETA: 1:23 - loss: 0.4533 - regression_loss: 0.4162 - classification_loss: 0.0371 259/500 [==============>...............] - ETA: 1:22 - loss: 0.4533 - regression_loss: 0.4163 - classification_loss: 0.0371 260/500 [==============>...............] - ETA: 1:22 - loss: 0.4528 - regression_loss: 0.4158 - classification_loss: 0.0370 261/500 [==============>...............] - ETA: 1:22 - loss: 0.4553 - regression_loss: 0.4182 - classification_loss: 0.0371 262/500 [==============>...............] - ETA: 1:21 - loss: 0.4545 - regression_loss: 0.4175 - classification_loss: 0.0370 263/500 [==============>...............] - ETA: 1:21 - loss: 0.4547 - regression_loss: 0.4177 - classification_loss: 0.0370 264/500 [==============>...............] - ETA: 1:21 - loss: 0.4547 - regression_loss: 0.4177 - classification_loss: 0.0370 265/500 [==============>...............] 
- ETA: 1:20 - loss: 0.4542 - regression_loss: 0.4173 - classification_loss: 0.0369 266/500 [==============>...............] - ETA: 1:20 - loss: 0.4538 - regression_loss: 0.4169 - classification_loss: 0.0369 267/500 [===============>..............] - ETA: 1:19 - loss: 0.4538 - regression_loss: 0.4168 - classification_loss: 0.0370 268/500 [===============>..............] - ETA: 1:19 - loss: 0.4538 - regression_loss: 0.4169 - classification_loss: 0.0369 269/500 [===============>..............] - ETA: 1:19 - loss: 0.4530 - regression_loss: 0.4161 - classification_loss: 0.0369 270/500 [===============>..............] - ETA: 1:18 - loss: 0.4530 - regression_loss: 0.4161 - classification_loss: 0.0369 271/500 [===============>..............] - ETA: 1:18 - loss: 0.4526 - regression_loss: 0.4158 - classification_loss: 0.0369 272/500 [===============>..............] - ETA: 1:18 - loss: 0.4522 - regression_loss: 0.4154 - classification_loss: 0.0368 273/500 [===============>..............] - ETA: 1:17 - loss: 0.4520 - regression_loss: 0.4152 - classification_loss: 0.0367 274/500 [===============>..............] - ETA: 1:17 - loss: 0.4519 - regression_loss: 0.4152 - classification_loss: 0.0367 275/500 [===============>..............] - ETA: 1:17 - loss: 0.4511 - regression_loss: 0.4145 - classification_loss: 0.0366 276/500 [===============>..............] - ETA: 1:16 - loss: 0.4508 - regression_loss: 0.4141 - classification_loss: 0.0367 277/500 [===============>..............] - ETA: 1:16 - loss: 0.4508 - regression_loss: 0.4141 - classification_loss: 0.0367 278/500 [===============>..............] - ETA: 1:16 - loss: 0.4510 - regression_loss: 0.4142 - classification_loss: 0.0368 279/500 [===============>..............] - ETA: 1:15 - loss: 0.4507 - regression_loss: 0.4139 - classification_loss: 0.0368 280/500 [===============>..............] - ETA: 1:15 - loss: 0.4515 - regression_loss: 0.4146 - classification_loss: 0.0369 281/500 [===============>..............] 
- ETA: 1:15 - loss: 0.4517 - regression_loss: 0.4148 - classification_loss: 0.0369 282/500 [===============>..............] - ETA: 1:14 - loss: 0.4517 - regression_loss: 0.4148 - classification_loss: 0.0369 283/500 [===============>..............] - ETA: 1:14 - loss: 0.4525 - regression_loss: 0.4157 - classification_loss: 0.0369 284/500 [================>.............] - ETA: 1:14 - loss: 0.4530 - regression_loss: 0.4161 - classification_loss: 0.0369 285/500 [================>.............] - ETA: 1:13 - loss: 0.4536 - regression_loss: 0.4166 - classification_loss: 0.0370 286/500 [================>.............] - ETA: 1:13 - loss: 0.4530 - regression_loss: 0.4160 - classification_loss: 0.0369 287/500 [================>.............] - ETA: 1:13 - loss: 0.4527 - regression_loss: 0.4158 - classification_loss: 0.0369 288/500 [================>.............] - ETA: 1:12 - loss: 0.4534 - regression_loss: 0.4164 - classification_loss: 0.0370 289/500 [================>.............] - ETA: 1:12 - loss: 0.4535 - regression_loss: 0.4165 - classification_loss: 0.0370 290/500 [================>.............] - ETA: 1:12 - loss: 0.4540 - regression_loss: 0.4170 - classification_loss: 0.0370 291/500 [================>.............] - ETA: 1:11 - loss: 0.4556 - regression_loss: 0.4186 - classification_loss: 0.0370 292/500 [================>.............] - ETA: 1:11 - loss: 0.4554 - regression_loss: 0.4184 - classification_loss: 0.0370 293/500 [================>.............] - ETA: 1:11 - loss: 0.4550 - regression_loss: 0.4180 - classification_loss: 0.0370 294/500 [================>.............] - ETA: 1:10 - loss: 0.4551 - regression_loss: 0.4182 - classification_loss: 0.0369 295/500 [================>.............] - ETA: 1:10 - loss: 0.4557 - regression_loss: 0.4187 - classification_loss: 0.0370 296/500 [================>.............] - ETA: 1:09 - loss: 0.4553 - regression_loss: 0.4183 - classification_loss: 0.0369 297/500 [================>.............] 
- ETA: 1:09 - loss: 0.4563 - regression_loss: 0.4192 - classification_loss: 0.0371 298/500 [================>.............] - ETA: 1:09 - loss: 0.4558 - regression_loss: 0.4187 - classification_loss: 0.0371 299/500 [================>.............] - ETA: 1:08 - loss: 0.4560 - regression_loss: 0.4190 - classification_loss: 0.0370 300/500 [=================>............] - ETA: 1:08 - loss: 0.4558 - regression_loss: 0.4188 - classification_loss: 0.0370 301/500 [=================>............] - ETA: 1:08 - loss: 0.4569 - regression_loss: 0.4199 - classification_loss: 0.0370 302/500 [=================>............] - ETA: 1:07 - loss: 0.4572 - regression_loss: 0.4201 - classification_loss: 0.0371 303/500 [=================>............] - ETA: 1:07 - loss: 0.4571 - regression_loss: 0.4200 - classification_loss: 0.0371 304/500 [=================>............] - ETA: 1:07 - loss: 0.4575 - regression_loss: 0.4204 - classification_loss: 0.0371 305/500 [=================>............] - ETA: 1:06 - loss: 0.4581 - regression_loss: 0.4209 - classification_loss: 0.0372 306/500 [=================>............] - ETA: 1:06 - loss: 0.4580 - regression_loss: 0.4208 - classification_loss: 0.0372 307/500 [=================>............] - ETA: 1:06 - loss: 0.4577 - regression_loss: 0.4205 - classification_loss: 0.0372 308/500 [=================>............] - ETA: 1:05 - loss: 0.4578 - regression_loss: 0.4207 - classification_loss: 0.0371 309/500 [=================>............] - ETA: 1:05 - loss: 0.4574 - regression_loss: 0.4204 - classification_loss: 0.0370 310/500 [=================>............] - ETA: 1:05 - loss: 0.4575 - regression_loss: 0.4205 - classification_loss: 0.0370 311/500 [=================>............] - ETA: 1:04 - loss: 0.4578 - regression_loss: 0.4208 - classification_loss: 0.0370 312/500 [=================>............] - ETA: 1:04 - loss: 0.4589 - regression_loss: 0.4218 - classification_loss: 0.0371 313/500 [=================>............] 
- ETA: 1:04 - loss: 0.4588 - regression_loss: 0.4217 - classification_loss: 0.0371 314/500 [=================>............] - ETA: 1:03 - loss: 0.4596 - regression_loss: 0.4224 - classification_loss: 0.0373 315/500 [=================>............] - ETA: 1:03 - loss: 0.4594 - regression_loss: 0.4222 - classification_loss: 0.0372 316/500 [=================>............] - ETA: 1:03 - loss: 0.4591 - regression_loss: 0.4219 - classification_loss: 0.0372 317/500 [==================>...........] - ETA: 1:02 - loss: 0.4596 - regression_loss: 0.4223 - classification_loss: 0.0373 318/500 [==================>...........] - ETA: 1:02 - loss: 0.4599 - regression_loss: 0.4227 - classification_loss: 0.0373 319/500 [==================>...........] - ETA: 1:02 - loss: 0.4596 - regression_loss: 0.4224 - classification_loss: 0.0372 320/500 [==================>...........] - ETA: 1:01 - loss: 0.4594 - regression_loss: 0.4222 - classification_loss: 0.0372 321/500 [==================>...........] - ETA: 1:01 - loss: 0.4585 - regression_loss: 0.4214 - classification_loss: 0.0371 322/500 [==================>...........] - ETA: 1:01 - loss: 0.4591 - regression_loss: 0.4219 - classification_loss: 0.0372 323/500 [==================>...........] - ETA: 1:00 - loss: 0.4589 - regression_loss: 0.4218 - classification_loss: 0.0371 324/500 [==================>...........] - ETA: 1:00 - loss: 0.4581 - regression_loss: 0.4211 - classification_loss: 0.0371 325/500 [==================>...........] - ETA: 1:00 - loss: 0.4570 - regression_loss: 0.4201 - classification_loss: 0.0370 326/500 [==================>...........] - ETA: 59s - loss: 0.4561 - regression_loss: 0.4193 - classification_loss: 0.0369  327/500 [==================>...........] - ETA: 59s - loss: 0.4557 - regression_loss: 0.4188 - classification_loss: 0.0368 328/500 [==================>...........] - ETA: 59s - loss: 0.4560 - regression_loss: 0.4192 - classification_loss: 0.0368 329/500 [==================>...........] 
[per-batch progress updates for epoch 23, batches 330-499, omitted; loss hovered around 0.451-0.456 throughout]
500/500 [==============================] - 172s 344ms/step - loss: 0.4518 - regression_loss: 0.4152 - classification_loss: 0.0366
1172 instances of class plum with average precision: 0.7322
mAP: 0.7322
Epoch 00023: saving model to ./training/snapshots/resnet101_pascal_23.h5
Epoch 24/150
[per-batch progress updates for epoch 24, batches 1-163, omitted; last reported: loss: 0.4543 - regression_loss: 0.4181 - classification_loss: 0.0362]
- ETA: 1:56 - loss: 0.4537 - regression_loss: 0.4176 - classification_loss: 0.0361 165/500 [========>.....................] - ETA: 1:55 - loss: 0.4550 - regression_loss: 0.4189 - classification_loss: 0.0362 166/500 [========>.....................] - ETA: 1:55 - loss: 0.4537 - regression_loss: 0.4177 - classification_loss: 0.0360 167/500 [=========>....................] - ETA: 1:55 - loss: 0.4540 - regression_loss: 0.4179 - classification_loss: 0.0361 168/500 [=========>....................] - ETA: 1:54 - loss: 0.4541 - regression_loss: 0.4178 - classification_loss: 0.0362 169/500 [=========>....................] - ETA: 1:54 - loss: 0.4530 - regression_loss: 0.4168 - classification_loss: 0.0362 170/500 [=========>....................] - ETA: 1:54 - loss: 0.4533 - regression_loss: 0.4170 - classification_loss: 0.0362 171/500 [=========>....................] - ETA: 1:53 - loss: 0.4534 - regression_loss: 0.4172 - classification_loss: 0.0362 172/500 [=========>....................] - ETA: 1:53 - loss: 0.4525 - regression_loss: 0.4165 - classification_loss: 0.0361 173/500 [=========>....................] - ETA: 1:53 - loss: 0.4535 - regression_loss: 0.4175 - classification_loss: 0.0361 174/500 [=========>....................] - ETA: 1:52 - loss: 0.4542 - regression_loss: 0.4179 - classification_loss: 0.0363 175/500 [=========>....................] - ETA: 1:52 - loss: 0.4530 - regression_loss: 0.4168 - classification_loss: 0.0361 176/500 [=========>....................] - ETA: 1:52 - loss: 0.4543 - regression_loss: 0.4179 - classification_loss: 0.0364 177/500 [=========>....................] - ETA: 1:51 - loss: 0.4543 - regression_loss: 0.4180 - classification_loss: 0.0362 178/500 [=========>....................] - ETA: 1:51 - loss: 0.4536 - regression_loss: 0.4174 - classification_loss: 0.0363 179/500 [=========>....................] - ETA: 1:51 - loss: 0.4541 - regression_loss: 0.4179 - classification_loss: 0.0362 180/500 [=========>....................] 
- ETA: 1:50 - loss: 0.4547 - regression_loss: 0.4184 - classification_loss: 0.0363 181/500 [=========>....................] - ETA: 1:50 - loss: 0.4542 - regression_loss: 0.4181 - classification_loss: 0.0362 182/500 [=========>....................] - ETA: 1:49 - loss: 0.4531 - regression_loss: 0.4171 - classification_loss: 0.0361 183/500 [=========>....................] - ETA: 1:49 - loss: 0.4523 - regression_loss: 0.4162 - classification_loss: 0.0361 184/500 [==========>...................] - ETA: 1:49 - loss: 0.4534 - regression_loss: 0.4172 - classification_loss: 0.0362 185/500 [==========>...................] - ETA: 1:48 - loss: 0.4537 - regression_loss: 0.4175 - classification_loss: 0.0362 186/500 [==========>...................] - ETA: 1:48 - loss: 0.4547 - regression_loss: 0.4184 - classification_loss: 0.0363 187/500 [==========>...................] - ETA: 1:48 - loss: 0.4542 - regression_loss: 0.4180 - classification_loss: 0.0362 188/500 [==========>...................] - ETA: 1:47 - loss: 0.4534 - regression_loss: 0.4173 - classification_loss: 0.0362 189/500 [==========>...................] - ETA: 1:47 - loss: 0.4522 - regression_loss: 0.4162 - classification_loss: 0.0360 190/500 [==========>...................] - ETA: 1:47 - loss: 0.4519 - regression_loss: 0.4160 - classification_loss: 0.0359 191/500 [==========>...................] - ETA: 1:46 - loss: 0.4530 - regression_loss: 0.4171 - classification_loss: 0.0359 192/500 [==========>...................] - ETA: 1:46 - loss: 0.4538 - regression_loss: 0.4177 - classification_loss: 0.0361 193/500 [==========>...................] - ETA: 1:46 - loss: 0.4550 - regression_loss: 0.4187 - classification_loss: 0.0363 194/500 [==========>...................] - ETA: 1:45 - loss: 0.4548 - regression_loss: 0.4186 - classification_loss: 0.0362 195/500 [==========>...................] - ETA: 1:45 - loss: 0.4540 - regression_loss: 0.4179 - classification_loss: 0.0361 196/500 [==========>...................] 
- ETA: 1:45 - loss: 0.4535 - regression_loss: 0.4174 - classification_loss: 0.0360 197/500 [==========>...................] - ETA: 1:44 - loss: 0.4529 - regression_loss: 0.4170 - classification_loss: 0.0359 198/500 [==========>...................] - ETA: 1:44 - loss: 0.4519 - regression_loss: 0.4160 - classification_loss: 0.0358 199/500 [==========>...................] - ETA: 1:44 - loss: 0.4519 - regression_loss: 0.4161 - classification_loss: 0.0358 200/500 [===========>..................] - ETA: 1:43 - loss: 0.4516 - regression_loss: 0.4158 - classification_loss: 0.0357 201/500 [===========>..................] - ETA: 1:43 - loss: 0.4524 - regression_loss: 0.4165 - classification_loss: 0.0359 202/500 [===========>..................] - ETA: 1:42 - loss: 0.4530 - regression_loss: 0.4171 - classification_loss: 0.0359 203/500 [===========>..................] - ETA: 1:42 - loss: 0.4533 - regression_loss: 0.4174 - classification_loss: 0.0359 204/500 [===========>..................] - ETA: 1:42 - loss: 0.4551 - regression_loss: 0.4190 - classification_loss: 0.0361 205/500 [===========>..................] - ETA: 1:41 - loss: 0.4548 - regression_loss: 0.4187 - classification_loss: 0.0361 206/500 [===========>..................] - ETA: 1:41 - loss: 0.4545 - regression_loss: 0.4185 - classification_loss: 0.0360 207/500 [===========>..................] - ETA: 1:41 - loss: 0.4544 - regression_loss: 0.4185 - classification_loss: 0.0359 208/500 [===========>..................] - ETA: 1:40 - loss: 0.4536 - regression_loss: 0.4178 - classification_loss: 0.0358 209/500 [===========>..................] - ETA: 1:40 - loss: 0.4532 - regression_loss: 0.4175 - classification_loss: 0.0357 210/500 [===========>..................] - ETA: 1:40 - loss: 0.4536 - regression_loss: 0.4179 - classification_loss: 0.0357 211/500 [===========>..................] - ETA: 1:39 - loss: 0.4536 - regression_loss: 0.4179 - classification_loss: 0.0357 212/500 [===========>..................] 
- ETA: 1:39 - loss: 0.4526 - regression_loss: 0.4170 - classification_loss: 0.0356 213/500 [===========>..................] - ETA: 1:39 - loss: 0.4531 - regression_loss: 0.4175 - classification_loss: 0.0356 214/500 [===========>..................] - ETA: 1:38 - loss: 0.4538 - regression_loss: 0.4181 - classification_loss: 0.0357 215/500 [===========>..................] - ETA: 1:38 - loss: 0.4547 - regression_loss: 0.4189 - classification_loss: 0.0358 216/500 [===========>..................] - ETA: 1:38 - loss: 0.4535 - regression_loss: 0.4179 - classification_loss: 0.0357 217/500 [============>.................] - ETA: 1:37 - loss: 0.4523 - regression_loss: 0.4167 - classification_loss: 0.0356 218/500 [============>.................] - ETA: 1:37 - loss: 0.4521 - regression_loss: 0.4165 - classification_loss: 0.0356 219/500 [============>.................] - ETA: 1:37 - loss: 0.4533 - regression_loss: 0.4174 - classification_loss: 0.0359 220/500 [============>.................] - ETA: 1:36 - loss: 0.4537 - regression_loss: 0.4178 - classification_loss: 0.0359 221/500 [============>.................] - ETA: 1:36 - loss: 0.4542 - regression_loss: 0.4182 - classification_loss: 0.0360 222/500 [============>.................] - ETA: 1:35 - loss: 0.4540 - regression_loss: 0.4180 - classification_loss: 0.0360 223/500 [============>.................] - ETA: 1:35 - loss: 0.4531 - regression_loss: 0.4172 - classification_loss: 0.0359 224/500 [============>.................] - ETA: 1:35 - loss: 0.4532 - regression_loss: 0.4173 - classification_loss: 0.0358 225/500 [============>.................] - ETA: 1:34 - loss: 0.4538 - regression_loss: 0.4179 - classification_loss: 0.0358 226/500 [============>.................] - ETA: 1:34 - loss: 0.4537 - regression_loss: 0.4179 - classification_loss: 0.0358 227/500 [============>.................] - ETA: 1:34 - loss: 0.4530 - regression_loss: 0.4172 - classification_loss: 0.0357 228/500 [============>.................] 
- ETA: 1:33 - loss: 0.4522 - regression_loss: 0.4165 - classification_loss: 0.0356 229/500 [============>.................] - ETA: 1:33 - loss: 0.4536 - regression_loss: 0.4178 - classification_loss: 0.0358 230/500 [============>.................] - ETA: 1:33 - loss: 0.4535 - regression_loss: 0.4177 - classification_loss: 0.0358 231/500 [============>.................] - ETA: 1:32 - loss: 0.4530 - regression_loss: 0.4172 - classification_loss: 0.0358 232/500 [============>.................] - ETA: 1:32 - loss: 0.4536 - regression_loss: 0.4177 - classification_loss: 0.0359 233/500 [============>.................] - ETA: 1:32 - loss: 0.4545 - regression_loss: 0.4184 - classification_loss: 0.0361 234/500 [=============>................] - ETA: 1:31 - loss: 0.4555 - regression_loss: 0.4193 - classification_loss: 0.0362 235/500 [=============>................] - ETA: 1:31 - loss: 0.4560 - regression_loss: 0.4197 - classification_loss: 0.0362 236/500 [=============>................] - ETA: 1:31 - loss: 0.4561 - regression_loss: 0.4199 - classification_loss: 0.0362 237/500 [=============>................] - ETA: 1:30 - loss: 0.4561 - regression_loss: 0.4199 - classification_loss: 0.0362 238/500 [=============>................] - ETA: 1:30 - loss: 0.4573 - regression_loss: 0.4208 - classification_loss: 0.0365 239/500 [=============>................] - ETA: 1:30 - loss: 0.4570 - regression_loss: 0.4205 - classification_loss: 0.0365 240/500 [=============>................] - ETA: 1:29 - loss: 0.4557 - regression_loss: 0.4193 - classification_loss: 0.0364 241/500 [=============>................] - ETA: 1:29 - loss: 0.4566 - regression_loss: 0.4201 - classification_loss: 0.0364 242/500 [=============>................] - ETA: 1:29 - loss: 0.4561 - regression_loss: 0.4197 - classification_loss: 0.0364 243/500 [=============>................] - ETA: 1:28 - loss: 0.4591 - regression_loss: 0.4223 - classification_loss: 0.0368 244/500 [=============>................] 
- ETA: 1:28 - loss: 0.4595 - regression_loss: 0.4226 - classification_loss: 0.0368 245/500 [=============>................] - ETA: 1:28 - loss: 0.4604 - regression_loss: 0.4236 - classification_loss: 0.0369 246/500 [=============>................] - ETA: 1:27 - loss: 0.4596 - regression_loss: 0.4229 - classification_loss: 0.0368 247/500 [=============>................] - ETA: 1:27 - loss: 0.4589 - regression_loss: 0.4221 - classification_loss: 0.0367 248/500 [=============>................] - ETA: 1:27 - loss: 0.4594 - regression_loss: 0.4226 - classification_loss: 0.0368 249/500 [=============>................] - ETA: 1:26 - loss: 0.4585 - regression_loss: 0.4218 - classification_loss: 0.0367 250/500 [==============>...............] - ETA: 1:26 - loss: 0.4582 - regression_loss: 0.4215 - classification_loss: 0.0367 251/500 [==============>...............] - ETA: 1:26 - loss: 0.4585 - regression_loss: 0.4218 - classification_loss: 0.0368 252/500 [==============>...............] - ETA: 1:25 - loss: 0.4581 - regression_loss: 0.4214 - classification_loss: 0.0367 253/500 [==============>...............] - ETA: 1:25 - loss: 0.4584 - regression_loss: 0.4216 - classification_loss: 0.0367 254/500 [==============>...............] - ETA: 1:24 - loss: 0.4573 - regression_loss: 0.4207 - classification_loss: 0.0366 255/500 [==============>...............] - ETA: 1:24 - loss: 0.4572 - regression_loss: 0.4206 - classification_loss: 0.0366 256/500 [==============>...............] - ETA: 1:24 - loss: 0.4576 - regression_loss: 0.4210 - classification_loss: 0.0366 257/500 [==============>...............] - ETA: 1:23 - loss: 0.4570 - regression_loss: 0.4205 - classification_loss: 0.0365 258/500 [==============>...............] - ETA: 1:23 - loss: 0.4577 - regression_loss: 0.4210 - classification_loss: 0.0367 259/500 [==============>...............] - ETA: 1:23 - loss: 0.4572 - regression_loss: 0.4205 - classification_loss: 0.0367 260/500 [==============>...............] 
- ETA: 1:22 - loss: 0.4571 - regression_loss: 0.4204 - classification_loss: 0.0367 261/500 [==============>...............] - ETA: 1:22 - loss: 0.4562 - regression_loss: 0.4196 - classification_loss: 0.0366 262/500 [==============>...............] - ETA: 1:22 - loss: 0.4570 - regression_loss: 0.4204 - classification_loss: 0.0366 263/500 [==============>...............] - ETA: 1:21 - loss: 0.4579 - regression_loss: 0.4213 - classification_loss: 0.0367 264/500 [==============>...............] - ETA: 1:21 - loss: 0.4583 - regression_loss: 0.4216 - classification_loss: 0.0367 265/500 [==============>...............] - ETA: 1:21 - loss: 0.4576 - regression_loss: 0.4209 - classification_loss: 0.0366 266/500 [==============>...............] - ETA: 1:20 - loss: 0.4565 - regression_loss: 0.4200 - classification_loss: 0.0365 267/500 [===============>..............] - ETA: 1:20 - loss: 0.4569 - regression_loss: 0.4203 - classification_loss: 0.0366 268/500 [===============>..............] - ETA: 1:20 - loss: 0.4570 - regression_loss: 0.4205 - classification_loss: 0.0365 269/500 [===============>..............] - ETA: 1:19 - loss: 0.4573 - regression_loss: 0.4207 - classification_loss: 0.0365 270/500 [===============>..............] - ETA: 1:19 - loss: 0.4571 - regression_loss: 0.4207 - classification_loss: 0.0364 271/500 [===============>..............] - ETA: 1:19 - loss: 0.4569 - regression_loss: 0.4205 - classification_loss: 0.0364 272/500 [===============>..............] - ETA: 1:18 - loss: 0.4560 - regression_loss: 0.4196 - classification_loss: 0.0363 273/500 [===============>..............] - ETA: 1:18 - loss: 0.4553 - regression_loss: 0.4191 - classification_loss: 0.0363 274/500 [===============>..............] - ETA: 1:18 - loss: 0.4558 - regression_loss: 0.4195 - classification_loss: 0.0363 275/500 [===============>..............] - ETA: 1:17 - loss: 0.4570 - regression_loss: 0.4207 - classification_loss: 0.0363 276/500 [===============>..............] 
- ETA: 1:17 - loss: 0.4558 - regression_loss: 0.4196 - classification_loss: 0.0362 277/500 [===============>..............] - ETA: 1:16 - loss: 0.4565 - regression_loss: 0.4203 - classification_loss: 0.0362 278/500 [===============>..............] - ETA: 1:16 - loss: 0.4570 - regression_loss: 0.4207 - classification_loss: 0.0363 279/500 [===============>..............] - ETA: 1:16 - loss: 0.4564 - regression_loss: 0.4201 - classification_loss: 0.0363 280/500 [===============>..............] - ETA: 1:15 - loss: 0.4563 - regression_loss: 0.4200 - classification_loss: 0.0363 281/500 [===============>..............] - ETA: 1:15 - loss: 0.4565 - regression_loss: 0.4201 - classification_loss: 0.0363 282/500 [===============>..............] - ETA: 1:15 - loss: 0.4561 - regression_loss: 0.4198 - classification_loss: 0.0363 283/500 [===============>..............] - ETA: 1:14 - loss: 0.4569 - regression_loss: 0.4206 - classification_loss: 0.0363 284/500 [================>.............] - ETA: 1:14 - loss: 0.4575 - regression_loss: 0.4211 - classification_loss: 0.0364 285/500 [================>.............] - ETA: 1:14 - loss: 0.4585 - regression_loss: 0.4220 - classification_loss: 0.0365 286/500 [================>.............] - ETA: 1:13 - loss: 0.4580 - regression_loss: 0.4216 - classification_loss: 0.0364 287/500 [================>.............] - ETA: 1:13 - loss: 0.4573 - regression_loss: 0.4210 - classification_loss: 0.0364 288/500 [================>.............] - ETA: 1:13 - loss: 0.4572 - regression_loss: 0.4209 - classification_loss: 0.0363 289/500 [================>.............] - ETA: 1:12 - loss: 0.4581 - regression_loss: 0.4217 - classification_loss: 0.0364 290/500 [================>.............] - ETA: 1:12 - loss: 0.4583 - regression_loss: 0.4218 - classification_loss: 0.0364 291/500 [================>.............] - ETA: 1:12 - loss: 0.4583 - regression_loss: 0.4219 - classification_loss: 0.0364 292/500 [================>.............] 
- ETA: 1:11 - loss: 0.4577 - regression_loss: 0.4214 - classification_loss: 0.0364 293/500 [================>.............] - ETA: 1:11 - loss: 0.4580 - regression_loss: 0.4215 - classification_loss: 0.0365 294/500 [================>.............] - ETA: 1:11 - loss: 0.4575 - regression_loss: 0.4211 - classification_loss: 0.0364 295/500 [================>.............] - ETA: 1:10 - loss: 0.4570 - regression_loss: 0.4206 - classification_loss: 0.0364 296/500 [================>.............] - ETA: 1:10 - loss: 0.4563 - regression_loss: 0.4200 - classification_loss: 0.0363 297/500 [================>.............] - ETA: 1:10 - loss: 0.4561 - regression_loss: 0.4199 - classification_loss: 0.0363 298/500 [================>.............] - ETA: 1:09 - loss: 0.4562 - regression_loss: 0.4199 - classification_loss: 0.0363 299/500 [================>.............] - ETA: 1:09 - loss: 0.4557 - regression_loss: 0.4194 - classification_loss: 0.0362 300/500 [=================>............] - ETA: 1:09 - loss: 0.4557 - regression_loss: 0.4195 - classification_loss: 0.0362 301/500 [=================>............] - ETA: 1:08 - loss: 0.4548 - regression_loss: 0.4186 - classification_loss: 0.0361 302/500 [=================>............] - ETA: 1:08 - loss: 0.4545 - regression_loss: 0.4183 - classification_loss: 0.0362 303/500 [=================>............] - ETA: 1:07 - loss: 0.4537 - regression_loss: 0.4176 - classification_loss: 0.0361 304/500 [=================>............] - ETA: 1:07 - loss: 0.4533 - regression_loss: 0.4173 - classification_loss: 0.0360 305/500 [=================>............] - ETA: 1:07 - loss: 0.4530 - regression_loss: 0.4171 - classification_loss: 0.0359 306/500 [=================>............] - ETA: 1:06 - loss: 0.4541 - regression_loss: 0.4180 - classification_loss: 0.0360 307/500 [=================>............] - ETA: 1:06 - loss: 0.4532 - regression_loss: 0.4173 - classification_loss: 0.0359 308/500 [=================>............] 
- ETA: 1:06 - loss: 0.4540 - regression_loss: 0.4180 - classification_loss: 0.0360 309/500 [=================>............] - ETA: 1:05 - loss: 0.4548 - regression_loss: 0.4187 - classification_loss: 0.0361 310/500 [=================>............] - ETA: 1:05 - loss: 0.4557 - regression_loss: 0.4196 - classification_loss: 0.0361 311/500 [=================>............] - ETA: 1:05 - loss: 0.4558 - regression_loss: 0.4197 - classification_loss: 0.0361 312/500 [=================>............] - ETA: 1:04 - loss: 0.4569 - regression_loss: 0.4208 - classification_loss: 0.0361 313/500 [=================>............] - ETA: 1:04 - loss: 0.4579 - regression_loss: 0.4216 - classification_loss: 0.0362 314/500 [=================>............] - ETA: 1:04 - loss: 0.4599 - regression_loss: 0.4234 - classification_loss: 0.0365 315/500 [=================>............] - ETA: 1:03 - loss: 0.4606 - regression_loss: 0.4240 - classification_loss: 0.0366 316/500 [=================>............] - ETA: 1:03 - loss: 0.4603 - regression_loss: 0.4237 - classification_loss: 0.0366 317/500 [==================>...........] - ETA: 1:03 - loss: 0.4609 - regression_loss: 0.4241 - classification_loss: 0.0368 318/500 [==================>...........] - ETA: 1:02 - loss: 0.4618 - regression_loss: 0.4249 - classification_loss: 0.0369 319/500 [==================>...........] - ETA: 1:02 - loss: 0.4612 - regression_loss: 0.4243 - classification_loss: 0.0369 320/500 [==================>...........] - ETA: 1:02 - loss: 0.4612 - regression_loss: 0.4243 - classification_loss: 0.0369 321/500 [==================>...........] - ETA: 1:01 - loss: 0.4607 - regression_loss: 0.4238 - classification_loss: 0.0369 322/500 [==================>...........] - ETA: 1:01 - loss: 0.4613 - regression_loss: 0.4242 - classification_loss: 0.0372 323/500 [==================>...........] - ETA: 1:01 - loss: 0.4616 - regression_loss: 0.4244 - classification_loss: 0.0372 324/500 [==================>...........] 
- ETA: 1:00 - loss: 0.4620 - regression_loss: 0.4247 - classification_loss: 0.0372 325/500 [==================>...........] - ETA: 1:00 - loss: 0.4624 - regression_loss: 0.4252 - classification_loss: 0.0372 326/500 [==================>...........] - ETA: 59s - loss: 0.4619 - regression_loss: 0.4247 - classification_loss: 0.0372  327/500 [==================>...........] - ETA: 59s - loss: 0.4614 - regression_loss: 0.4243 - classification_loss: 0.0371 328/500 [==================>...........] - ETA: 59s - loss: 0.4613 - regression_loss: 0.4242 - classification_loss: 0.0371 329/500 [==================>...........] - ETA: 58s - loss: 0.4610 - regression_loss: 0.4239 - classification_loss: 0.0371 330/500 [==================>...........] - ETA: 58s - loss: 0.4608 - regression_loss: 0.4238 - classification_loss: 0.0370 331/500 [==================>...........] - ETA: 58s - loss: 0.4609 - regression_loss: 0.4239 - classification_loss: 0.0371 332/500 [==================>...........] - ETA: 57s - loss: 0.4612 - regression_loss: 0.4240 - classification_loss: 0.0371 333/500 [==================>...........] - ETA: 57s - loss: 0.4602 - regression_loss: 0.4232 - classification_loss: 0.0370 334/500 [===================>..........] - ETA: 57s - loss: 0.4599 - regression_loss: 0.4229 - classification_loss: 0.0370 335/500 [===================>..........] - ETA: 56s - loss: 0.4599 - regression_loss: 0.4229 - classification_loss: 0.0370 336/500 [===================>..........] - ETA: 56s - loss: 0.4607 - regression_loss: 0.4236 - classification_loss: 0.0371 337/500 [===================>..........] - ETA: 56s - loss: 0.4607 - regression_loss: 0.4236 - classification_loss: 0.0371 338/500 [===================>..........] - ETA: 55s - loss: 0.4605 - regression_loss: 0.4235 - classification_loss: 0.0370 339/500 [===================>..........] - ETA: 55s - loss: 0.4605 - regression_loss: 0.4235 - classification_loss: 0.0370 340/500 [===================>..........] 
- ETA: 55s - loss: 0.4600 - regression_loss: 0.4231 - classification_loss: 0.0370 341/500 [===================>..........] - ETA: 54s - loss: 0.4597 - regression_loss: 0.4227 - classification_loss: 0.0371 342/500 [===================>..........] - ETA: 54s - loss: 0.4595 - regression_loss: 0.4224 - classification_loss: 0.0371 343/500 [===================>..........] - ETA: 54s - loss: 0.4590 - regression_loss: 0.4220 - classification_loss: 0.0370 344/500 [===================>..........] - ETA: 53s - loss: 0.4584 - regression_loss: 0.4215 - classification_loss: 0.0370 345/500 [===================>..........] - ETA: 53s - loss: 0.4577 - regression_loss: 0.4208 - classification_loss: 0.0369 346/500 [===================>..........] - ETA: 53s - loss: 0.4576 - regression_loss: 0.4207 - classification_loss: 0.0369 347/500 [===================>..........] - ETA: 52s - loss: 0.4577 - regression_loss: 0.4207 - classification_loss: 0.0369 348/500 [===================>..........] - ETA: 52s - loss: 0.4573 - regression_loss: 0.4204 - classification_loss: 0.0369 349/500 [===================>..........] - ETA: 52s - loss: 0.4582 - regression_loss: 0.4212 - classification_loss: 0.0370 350/500 [====================>.........] - ETA: 51s - loss: 0.4579 - regression_loss: 0.4210 - classification_loss: 0.0369 351/500 [====================>.........] - ETA: 51s - loss: 0.4579 - regression_loss: 0.4209 - classification_loss: 0.0370 352/500 [====================>.........] - ETA: 50s - loss: 0.4578 - regression_loss: 0.4209 - classification_loss: 0.0370 353/500 [====================>.........] - ETA: 50s - loss: 0.4577 - regression_loss: 0.4208 - classification_loss: 0.0369 354/500 [====================>.........] - ETA: 50s - loss: 0.4574 - regression_loss: 0.4205 - classification_loss: 0.0369 355/500 [====================>.........] - ETA: 49s - loss: 0.4576 - regression_loss: 0.4207 - classification_loss: 0.0369 356/500 [====================>.........] 
- ETA: 49s - loss: 0.4570 - regression_loss: 0.4202 - classification_loss: 0.0368 357/500 [====================>.........] - ETA: 49s - loss: 0.4572 - regression_loss: 0.4204 - classification_loss: 0.0368 358/500 [====================>.........] - ETA: 48s - loss: 0.4564 - regression_loss: 0.4197 - classification_loss: 0.0368 359/500 [====================>.........] - ETA: 48s - loss: 0.4566 - regression_loss: 0.4199 - classification_loss: 0.0367 360/500 [====================>.........] - ETA: 48s - loss: 0.4570 - regression_loss: 0.4202 - classification_loss: 0.0367 361/500 [====================>.........] - ETA: 47s - loss: 0.4567 - regression_loss: 0.4200 - classification_loss: 0.0367 362/500 [====================>.........] - ETA: 47s - loss: 0.4567 - regression_loss: 0.4200 - classification_loss: 0.0367 363/500 [====================>.........] - ETA: 47s - loss: 0.4561 - regression_loss: 0.4195 - classification_loss: 0.0367 364/500 [====================>.........] - ETA: 46s - loss: 0.4558 - regression_loss: 0.4192 - classification_loss: 0.0366 365/500 [====================>.........] - ETA: 46s - loss: 0.4561 - regression_loss: 0.4194 - classification_loss: 0.0366 366/500 [====================>.........] - ETA: 46s - loss: 0.4561 - regression_loss: 0.4195 - classification_loss: 0.0366 367/500 [=====================>........] - ETA: 45s - loss: 0.4560 - regression_loss: 0.4194 - classification_loss: 0.0366 368/500 [=====================>........] - ETA: 45s - loss: 0.4560 - regression_loss: 0.4194 - classification_loss: 0.0366 369/500 [=====================>........] - ETA: 45s - loss: 0.4566 - regression_loss: 0.4199 - classification_loss: 0.0366 370/500 [=====================>........] - ETA: 44s - loss: 0.4563 - regression_loss: 0.4197 - classification_loss: 0.0366 371/500 [=====================>........] - ETA: 44s - loss: 0.4562 - regression_loss: 0.4196 - classification_loss: 0.0366 372/500 [=====================>........] 
[per-batch progress updates for epoch 24, batches 373-499, omitted; superseded by the final 500/500 summary below]
500/500 [==============================] - 172s 344ms/step - loss: 0.4502 - regression_loss: 0.4140 - classification_loss: 0.0361
1172 instances of class plum with average precision: 0.7430
mAP: 0.7430
Epoch 00024: saving model to ./training/snapshots/resnet101_pascal_24.h5
Epoch 25/150
[per-batch progress updates for epoch 25, batches 1-206, omitted; last reported state: 205/500 - ETA: 1:41 - loss: 0.4510 - regression_loss: 0.4167 - classification_loss: 0.0343]
- ETA: 1:41 - loss: 0.4510 - regression_loss: 0.4167 - classification_loss: 0.0343 207/500 [===========>..................] - ETA: 1:40 - loss: 0.4529 - regression_loss: 0.4185 - classification_loss: 0.0344 208/500 [===========>..................] - ETA: 1:40 - loss: 0.4531 - regression_loss: 0.4186 - classification_loss: 0.0344 209/500 [===========>..................] - ETA: 1:40 - loss: 0.4524 - regression_loss: 0.4180 - classification_loss: 0.0343 210/500 [===========>..................] - ETA: 1:39 - loss: 0.4520 - regression_loss: 0.4177 - classification_loss: 0.0343 211/500 [===========>..................] - ETA: 1:39 - loss: 0.4503 - regression_loss: 0.4162 - classification_loss: 0.0341 212/500 [===========>..................] - ETA: 1:39 - loss: 0.4498 - regression_loss: 0.4158 - classification_loss: 0.0340 213/500 [===========>..................] - ETA: 1:38 - loss: 0.4501 - regression_loss: 0.4160 - classification_loss: 0.0342 214/500 [===========>..................] - ETA: 1:38 - loss: 0.4489 - regression_loss: 0.4149 - classification_loss: 0.0341 215/500 [===========>..................] - ETA: 1:38 - loss: 0.4481 - regression_loss: 0.4142 - classification_loss: 0.0340 216/500 [===========>..................] - ETA: 1:37 - loss: 0.4483 - regression_loss: 0.4143 - classification_loss: 0.0340 217/500 [============>.................] - ETA: 1:37 - loss: 0.4477 - regression_loss: 0.4137 - classification_loss: 0.0339 218/500 [============>.................] - ETA: 1:37 - loss: 0.4475 - regression_loss: 0.4137 - classification_loss: 0.0339 219/500 [============>.................] - ETA: 1:36 - loss: 0.4463 - regression_loss: 0.4125 - classification_loss: 0.0338 220/500 [============>.................] - ETA: 1:36 - loss: 0.4465 - regression_loss: 0.4127 - classification_loss: 0.0338 221/500 [============>.................] - ETA: 1:36 - loss: 0.4466 - regression_loss: 0.4128 - classification_loss: 0.0338 222/500 [============>.................] 
- ETA: 1:35 - loss: 0.4468 - regression_loss: 0.4131 - classification_loss: 0.0338 223/500 [============>.................] - ETA: 1:35 - loss: 0.4470 - regression_loss: 0.4132 - classification_loss: 0.0338 224/500 [============>.................] - ETA: 1:35 - loss: 0.4472 - regression_loss: 0.4134 - classification_loss: 0.0338 225/500 [============>.................] - ETA: 1:34 - loss: 0.4464 - regression_loss: 0.4127 - classification_loss: 0.0337 226/500 [============>.................] - ETA: 1:34 - loss: 0.4467 - regression_loss: 0.4130 - classification_loss: 0.0337 227/500 [============>.................] - ETA: 1:34 - loss: 0.4468 - regression_loss: 0.4131 - classification_loss: 0.0337 228/500 [============>.................] - ETA: 1:33 - loss: 0.4466 - regression_loss: 0.4128 - classification_loss: 0.0337 229/500 [============>.................] - ETA: 1:33 - loss: 0.4454 - regression_loss: 0.4118 - classification_loss: 0.0336 230/500 [============>.................] - ETA: 1:33 - loss: 0.4446 - regression_loss: 0.4111 - classification_loss: 0.0335 231/500 [============>.................] - ETA: 1:32 - loss: 0.4442 - regression_loss: 0.4107 - classification_loss: 0.0335 232/500 [============>.................] - ETA: 1:32 - loss: 0.4442 - regression_loss: 0.4108 - classification_loss: 0.0334 233/500 [============>.................] - ETA: 1:32 - loss: 0.4446 - regression_loss: 0.4111 - classification_loss: 0.0335 234/500 [=============>................] - ETA: 1:31 - loss: 0.4456 - regression_loss: 0.4120 - classification_loss: 0.0337 235/500 [=============>................] - ETA: 1:31 - loss: 0.4449 - regression_loss: 0.4113 - classification_loss: 0.0336 236/500 [=============>................] - ETA: 1:30 - loss: 0.4446 - regression_loss: 0.4110 - classification_loss: 0.0336 237/500 [=============>................] - ETA: 1:30 - loss: 0.4445 - regression_loss: 0.4109 - classification_loss: 0.0336 238/500 [=============>................] 
- ETA: 1:30 - loss: 0.4442 - regression_loss: 0.4107 - classification_loss: 0.0335 239/500 [=============>................] - ETA: 1:29 - loss: 0.4445 - regression_loss: 0.4110 - classification_loss: 0.0336 240/500 [=============>................] - ETA: 1:29 - loss: 0.4439 - regression_loss: 0.4104 - classification_loss: 0.0335 241/500 [=============>................] - ETA: 1:29 - loss: 0.4433 - regression_loss: 0.4099 - classification_loss: 0.0334 242/500 [=============>................] - ETA: 1:28 - loss: 0.4424 - regression_loss: 0.4091 - classification_loss: 0.0334 243/500 [=============>................] - ETA: 1:28 - loss: 0.4434 - regression_loss: 0.4099 - classification_loss: 0.0335 244/500 [=============>................] - ETA: 1:28 - loss: 0.4437 - regression_loss: 0.4102 - classification_loss: 0.0335 245/500 [=============>................] - ETA: 1:27 - loss: 0.4427 - regression_loss: 0.4093 - classification_loss: 0.0334 246/500 [=============>................] - ETA: 1:27 - loss: 0.4420 - regression_loss: 0.4086 - classification_loss: 0.0333 247/500 [=============>................] - ETA: 1:27 - loss: 0.4414 - regression_loss: 0.4081 - classification_loss: 0.0333 248/500 [=============>................] - ETA: 1:26 - loss: 0.4405 - regression_loss: 0.4073 - classification_loss: 0.0332 249/500 [=============>................] - ETA: 1:26 - loss: 0.4403 - regression_loss: 0.4072 - classification_loss: 0.0332 250/500 [==============>...............] - ETA: 1:26 - loss: 0.4409 - regression_loss: 0.4076 - classification_loss: 0.0333 251/500 [==============>...............] - ETA: 1:25 - loss: 0.4419 - regression_loss: 0.4084 - classification_loss: 0.0335 252/500 [==============>...............] - ETA: 1:25 - loss: 0.4416 - regression_loss: 0.4082 - classification_loss: 0.0334 253/500 [==============>...............] - ETA: 1:25 - loss: 0.4413 - regression_loss: 0.4080 - classification_loss: 0.0333 254/500 [==============>...............] 
- ETA: 1:24 - loss: 0.4408 - regression_loss: 0.4075 - classification_loss: 0.0333 255/500 [==============>...............] - ETA: 1:24 - loss: 0.4407 - regression_loss: 0.4074 - classification_loss: 0.0333 256/500 [==============>...............] - ETA: 1:24 - loss: 0.4398 - regression_loss: 0.4066 - classification_loss: 0.0332 257/500 [==============>...............] - ETA: 1:23 - loss: 0.4396 - regression_loss: 0.4064 - classification_loss: 0.0332 258/500 [==============>...............] - ETA: 1:23 - loss: 0.4396 - regression_loss: 0.4064 - classification_loss: 0.0332 259/500 [==============>...............] - ETA: 1:23 - loss: 0.4395 - regression_loss: 0.4063 - classification_loss: 0.0333 260/500 [==============>...............] - ETA: 1:22 - loss: 0.4386 - regression_loss: 0.4054 - classification_loss: 0.0332 261/500 [==============>...............] - ETA: 1:22 - loss: 0.4397 - regression_loss: 0.4065 - classification_loss: 0.0331 262/500 [==============>...............] - ETA: 1:22 - loss: 0.4399 - regression_loss: 0.4068 - classification_loss: 0.0332 263/500 [==============>...............] - ETA: 1:21 - loss: 0.4401 - regression_loss: 0.4069 - classification_loss: 0.0332 264/500 [==============>...............] - ETA: 1:21 - loss: 0.4399 - regression_loss: 0.4067 - classification_loss: 0.0332 265/500 [==============>...............] - ETA: 1:21 - loss: 0.4399 - regression_loss: 0.4067 - classification_loss: 0.0332 266/500 [==============>...............] - ETA: 1:20 - loss: 0.4392 - regression_loss: 0.4061 - classification_loss: 0.0331 267/500 [===============>..............] - ETA: 1:20 - loss: 0.4399 - regression_loss: 0.4066 - classification_loss: 0.0332 268/500 [===============>..............] - ETA: 1:20 - loss: 0.4393 - regression_loss: 0.4061 - classification_loss: 0.0332 269/500 [===============>..............] - ETA: 1:19 - loss: 0.4381 - regression_loss: 0.4050 - classification_loss: 0.0331 270/500 [===============>..............] 
- ETA: 1:19 - loss: 0.4377 - regression_loss: 0.4046 - classification_loss: 0.0331 271/500 [===============>..............] - ETA: 1:18 - loss: 0.4379 - regression_loss: 0.4048 - classification_loss: 0.0331 272/500 [===============>..............] - ETA: 1:18 - loss: 0.4377 - regression_loss: 0.4046 - classification_loss: 0.0331 273/500 [===============>..............] - ETA: 1:18 - loss: 0.4371 - regression_loss: 0.4040 - classification_loss: 0.0331 274/500 [===============>..............] - ETA: 1:17 - loss: 0.4375 - regression_loss: 0.4042 - classification_loss: 0.0333 275/500 [===============>..............] - ETA: 1:17 - loss: 0.4367 - regression_loss: 0.4035 - classification_loss: 0.0332 276/500 [===============>..............] - ETA: 1:17 - loss: 0.4380 - regression_loss: 0.4047 - classification_loss: 0.0333 277/500 [===============>..............] - ETA: 1:16 - loss: 0.4380 - regression_loss: 0.4047 - classification_loss: 0.0333 278/500 [===============>..............] - ETA: 1:16 - loss: 0.4386 - regression_loss: 0.4051 - classification_loss: 0.0335 279/500 [===============>..............] - ETA: 1:16 - loss: 0.4383 - regression_loss: 0.4048 - classification_loss: 0.0335 280/500 [===============>..............] - ETA: 1:15 - loss: 0.4386 - regression_loss: 0.4051 - classification_loss: 0.0336 281/500 [===============>..............] - ETA: 1:15 - loss: 0.4386 - regression_loss: 0.4050 - classification_loss: 0.0336 282/500 [===============>..............] - ETA: 1:15 - loss: 0.4384 - regression_loss: 0.4048 - classification_loss: 0.0336 283/500 [===============>..............] - ETA: 1:14 - loss: 0.4378 - regression_loss: 0.4043 - classification_loss: 0.0335 284/500 [================>.............] - ETA: 1:14 - loss: 0.4391 - regression_loss: 0.4055 - classification_loss: 0.0336 285/500 [================>.............] - ETA: 1:14 - loss: 0.4405 - regression_loss: 0.4067 - classification_loss: 0.0338 286/500 [================>.............] 
- ETA: 1:13 - loss: 0.4403 - regression_loss: 0.4066 - classification_loss: 0.0337 287/500 [================>.............] - ETA: 1:13 - loss: 0.4397 - regression_loss: 0.4060 - classification_loss: 0.0337 288/500 [================>.............] - ETA: 1:13 - loss: 0.4400 - regression_loss: 0.4064 - classification_loss: 0.0337 289/500 [================>.............] - ETA: 1:12 - loss: 0.4397 - regression_loss: 0.4060 - classification_loss: 0.0337 290/500 [================>.............] - ETA: 1:12 - loss: 0.4405 - regression_loss: 0.4068 - classification_loss: 0.0337 291/500 [================>.............] - ETA: 1:12 - loss: 0.4399 - regression_loss: 0.4063 - classification_loss: 0.0336 292/500 [================>.............] - ETA: 1:11 - loss: 0.4411 - regression_loss: 0.4073 - classification_loss: 0.0338 293/500 [================>.............] - ETA: 1:11 - loss: 0.4410 - regression_loss: 0.4073 - classification_loss: 0.0337 294/500 [================>.............] - ETA: 1:10 - loss: 0.4411 - regression_loss: 0.4074 - classification_loss: 0.0337 295/500 [================>.............] - ETA: 1:10 - loss: 0.4414 - regression_loss: 0.4077 - classification_loss: 0.0337 296/500 [================>.............] - ETA: 1:10 - loss: 0.4417 - regression_loss: 0.4080 - classification_loss: 0.0337 297/500 [================>.............] - ETA: 1:09 - loss: 0.4406 - regression_loss: 0.4070 - classification_loss: 0.0336 298/500 [================>.............] - ETA: 1:09 - loss: 0.4402 - regression_loss: 0.4066 - classification_loss: 0.0336 299/500 [================>.............] - ETA: 1:09 - loss: 0.4399 - regression_loss: 0.4063 - classification_loss: 0.0336 300/500 [=================>............] - ETA: 1:08 - loss: 0.4403 - regression_loss: 0.4067 - classification_loss: 0.0337 301/500 [=================>............] - ETA: 1:08 - loss: 0.4407 - regression_loss: 0.4071 - classification_loss: 0.0336 302/500 [=================>............] 
- ETA: 1:08 - loss: 0.4404 - regression_loss: 0.4068 - classification_loss: 0.0336 303/500 [=================>............] - ETA: 1:07 - loss: 0.4410 - regression_loss: 0.4075 - classification_loss: 0.0336 304/500 [=================>............] - ETA: 1:07 - loss: 0.4412 - regression_loss: 0.4077 - classification_loss: 0.0335 305/500 [=================>............] - ETA: 1:07 - loss: 0.4409 - regression_loss: 0.4074 - classification_loss: 0.0335 306/500 [=================>............] - ETA: 1:06 - loss: 0.4408 - regression_loss: 0.4073 - classification_loss: 0.0335 307/500 [=================>............] - ETA: 1:06 - loss: 0.4420 - regression_loss: 0.4083 - classification_loss: 0.0337 308/500 [=================>............] - ETA: 1:06 - loss: 0.4413 - regression_loss: 0.4077 - classification_loss: 0.0336 309/500 [=================>............] - ETA: 1:05 - loss: 0.4406 - regression_loss: 0.4071 - classification_loss: 0.0335 310/500 [=================>............] - ETA: 1:05 - loss: 0.4408 - regression_loss: 0.4073 - classification_loss: 0.0335 311/500 [=================>............] - ETA: 1:05 - loss: 0.4402 - regression_loss: 0.4067 - classification_loss: 0.0335 312/500 [=================>............] - ETA: 1:04 - loss: 0.4399 - regression_loss: 0.4064 - classification_loss: 0.0335 313/500 [=================>............] - ETA: 1:04 - loss: 0.4393 - regression_loss: 0.4059 - classification_loss: 0.0334 314/500 [=================>............] - ETA: 1:04 - loss: 0.4390 - regression_loss: 0.4056 - classification_loss: 0.0334 315/500 [=================>............] - ETA: 1:03 - loss: 0.4394 - regression_loss: 0.4059 - classification_loss: 0.0335 316/500 [=================>............] - ETA: 1:03 - loss: 0.4401 - regression_loss: 0.4065 - classification_loss: 0.0337 317/500 [==================>...........] - ETA: 1:02 - loss: 0.4400 - regression_loss: 0.4064 - classification_loss: 0.0336 318/500 [==================>...........] 
- ETA: 1:02 - loss: 0.4406 - regression_loss: 0.4068 - classification_loss: 0.0338 319/500 [==================>...........] - ETA: 1:02 - loss: 0.4398 - regression_loss: 0.4061 - classification_loss: 0.0337 320/500 [==================>...........] - ETA: 1:01 - loss: 0.4408 - regression_loss: 0.4070 - classification_loss: 0.0338 321/500 [==================>...........] - ETA: 1:01 - loss: 0.4410 - regression_loss: 0.4072 - classification_loss: 0.0338 322/500 [==================>...........] - ETA: 1:01 - loss: 0.4404 - regression_loss: 0.4066 - classification_loss: 0.0337 323/500 [==================>...........] - ETA: 1:00 - loss: 0.4400 - regression_loss: 0.4063 - classification_loss: 0.0337 324/500 [==================>...........] - ETA: 1:00 - loss: 0.4406 - regression_loss: 0.4069 - classification_loss: 0.0337 325/500 [==================>...........] - ETA: 1:00 - loss: 0.4402 - regression_loss: 0.4066 - classification_loss: 0.0336 326/500 [==================>...........] - ETA: 59s - loss: 0.4402 - regression_loss: 0.4066 - classification_loss: 0.0336  327/500 [==================>...........] - ETA: 59s - loss: 0.4413 - regression_loss: 0.4076 - classification_loss: 0.0337 328/500 [==================>...........] - ETA: 59s - loss: 0.4419 - regression_loss: 0.4082 - classification_loss: 0.0337 329/500 [==================>...........] - ETA: 58s - loss: 0.4424 - regression_loss: 0.4088 - classification_loss: 0.0337 330/500 [==================>...........] - ETA: 58s - loss: 0.4425 - regression_loss: 0.4088 - classification_loss: 0.0337 331/500 [==================>...........] - ETA: 58s - loss: 0.4426 - regression_loss: 0.4090 - classification_loss: 0.0336 332/500 [==================>...........] - ETA: 57s - loss: 0.4426 - regression_loss: 0.4090 - classification_loss: 0.0336 333/500 [==================>...........] - ETA: 57s - loss: 0.4425 - regression_loss: 0.4089 - classification_loss: 0.0336 334/500 [===================>..........] 
- ETA: 57s - loss: 0.4423 - regression_loss: 0.4088 - classification_loss: 0.0335 335/500 [===================>..........] - ETA: 56s - loss: 0.4426 - regression_loss: 0.4090 - classification_loss: 0.0336 336/500 [===================>..........] - ETA: 56s - loss: 0.4429 - regression_loss: 0.4093 - classification_loss: 0.0336 337/500 [===================>..........] - ETA: 56s - loss: 0.4433 - regression_loss: 0.4097 - classification_loss: 0.0336 338/500 [===================>..........] - ETA: 55s - loss: 0.4433 - regression_loss: 0.4097 - classification_loss: 0.0336 339/500 [===================>..........] - ETA: 55s - loss: 0.4436 - regression_loss: 0.4099 - classification_loss: 0.0337 340/500 [===================>..........] - ETA: 55s - loss: 0.4437 - regression_loss: 0.4099 - classification_loss: 0.0338 341/500 [===================>..........] - ETA: 54s - loss: 0.4435 - regression_loss: 0.4096 - classification_loss: 0.0338 342/500 [===================>..........] - ETA: 54s - loss: 0.4444 - regression_loss: 0.4105 - classification_loss: 0.0339 343/500 [===================>..........] - ETA: 54s - loss: 0.4441 - regression_loss: 0.4102 - classification_loss: 0.0339 344/500 [===================>..........] - ETA: 53s - loss: 0.4440 - regression_loss: 0.4101 - classification_loss: 0.0338 345/500 [===================>..........] - ETA: 53s - loss: 0.4440 - regression_loss: 0.4101 - classification_loss: 0.0339 346/500 [===================>..........] - ETA: 52s - loss: 0.4446 - regression_loss: 0.4106 - classification_loss: 0.0340 347/500 [===================>..........] - ETA: 52s - loss: 0.4440 - regression_loss: 0.4101 - classification_loss: 0.0339 348/500 [===================>..........] - ETA: 52s - loss: 0.4441 - regression_loss: 0.4101 - classification_loss: 0.0339 349/500 [===================>..........] - ETA: 51s - loss: 0.4449 - regression_loss: 0.4110 - classification_loss: 0.0339 350/500 [====================>.........] 
- ETA: 51s - loss: 0.4448 - regression_loss: 0.4108 - classification_loss: 0.0340 351/500 [====================>.........] - ETA: 51s - loss: 0.4448 - regression_loss: 0.4109 - classification_loss: 0.0340 352/500 [====================>.........] - ETA: 50s - loss: 0.4450 - regression_loss: 0.4110 - classification_loss: 0.0340 353/500 [====================>.........] - ETA: 50s - loss: 0.4445 - regression_loss: 0.4107 - classification_loss: 0.0339 354/500 [====================>.........] - ETA: 50s - loss: 0.4441 - regression_loss: 0.4103 - classification_loss: 0.0338 355/500 [====================>.........] - ETA: 49s - loss: 0.4441 - regression_loss: 0.4103 - classification_loss: 0.0338 356/500 [====================>.........] - ETA: 49s - loss: 0.4436 - regression_loss: 0.4098 - classification_loss: 0.0338 357/500 [====================>.........] - ETA: 49s - loss: 0.4438 - regression_loss: 0.4100 - classification_loss: 0.0338 358/500 [====================>.........] - ETA: 48s - loss: 0.4429 - regression_loss: 0.4091 - classification_loss: 0.0337 359/500 [====================>.........] - ETA: 48s - loss: 0.4434 - regression_loss: 0.4095 - classification_loss: 0.0338 360/500 [====================>.........] - ETA: 48s - loss: 0.4432 - regression_loss: 0.4094 - classification_loss: 0.0338 361/500 [====================>.........] - ETA: 47s - loss: 0.4435 - regression_loss: 0.4096 - classification_loss: 0.0339 362/500 [====================>.........] - ETA: 47s - loss: 0.4440 - regression_loss: 0.4100 - classification_loss: 0.0340 363/500 [====================>.........] - ETA: 47s - loss: 0.4447 - regression_loss: 0.4107 - classification_loss: 0.0340 364/500 [====================>.........] - ETA: 46s - loss: 0.4456 - regression_loss: 0.4115 - classification_loss: 0.0341 365/500 [====================>.........] - ETA: 46s - loss: 0.4454 - regression_loss: 0.4113 - classification_loss: 0.0341 366/500 [====================>.........] 
- ETA: 46s - loss: 0.4452 - regression_loss: 0.4112 - classification_loss: 0.0341 367/500 [=====================>........] - ETA: 45s - loss: 0.4449 - regression_loss: 0.4109 - classification_loss: 0.0340 368/500 [=====================>........] - ETA: 45s - loss: 0.4450 - regression_loss: 0.4110 - classification_loss: 0.0340 369/500 [=====================>........] - ETA: 45s - loss: 0.4452 - regression_loss: 0.4112 - classification_loss: 0.0340 370/500 [=====================>........] - ETA: 44s - loss: 0.4446 - regression_loss: 0.4106 - classification_loss: 0.0340 371/500 [=====================>........] - ETA: 44s - loss: 0.4441 - regression_loss: 0.4102 - classification_loss: 0.0339 372/500 [=====================>........] - ETA: 44s - loss: 0.4443 - regression_loss: 0.4103 - classification_loss: 0.0340 373/500 [=====================>........] - ETA: 43s - loss: 0.4442 - regression_loss: 0.4102 - classification_loss: 0.0340 374/500 [=====================>........] - ETA: 43s - loss: 0.4441 - regression_loss: 0.4101 - classification_loss: 0.0340 375/500 [=====================>........] - ETA: 42s - loss: 0.4436 - regression_loss: 0.4097 - classification_loss: 0.0339 376/500 [=====================>........] - ETA: 42s - loss: 0.4448 - regression_loss: 0.4107 - classification_loss: 0.0341 377/500 [=====================>........] - ETA: 42s - loss: 0.4450 - regression_loss: 0.4108 - classification_loss: 0.0342 378/500 [=====================>........] - ETA: 41s - loss: 0.4445 - regression_loss: 0.4104 - classification_loss: 0.0341 379/500 [=====================>........] - ETA: 41s - loss: 0.4443 - regression_loss: 0.4102 - classification_loss: 0.0341 380/500 [=====================>........] - ETA: 41s - loss: 0.4439 - regression_loss: 0.4098 - classification_loss: 0.0341 381/500 [=====================>........] - ETA: 40s - loss: 0.4443 - regression_loss: 0.4102 - classification_loss: 0.0341 382/500 [=====================>........] 
- ETA: 40s - loss: 0.4447 - regression_loss: 0.4104 - classification_loss: 0.0342 383/500 [=====================>........] - ETA: 40s - loss: 0.4447 - regression_loss: 0.4105 - classification_loss: 0.0342 384/500 [======================>.......] - ETA: 39s - loss: 0.4441 - regression_loss: 0.4100 - classification_loss: 0.0342 385/500 [======================>.......] - ETA: 39s - loss: 0.4441 - regression_loss: 0.4099 - classification_loss: 0.0342 386/500 [======================>.......] - ETA: 39s - loss: 0.4434 - regression_loss: 0.4093 - classification_loss: 0.0341 387/500 [======================>.......] - ETA: 38s - loss: 0.4434 - regression_loss: 0.4092 - classification_loss: 0.0342 388/500 [======================>.......] - ETA: 38s - loss: 0.4431 - regression_loss: 0.4090 - classification_loss: 0.0341 389/500 [======================>.......] - ETA: 38s - loss: 0.4434 - regression_loss: 0.4092 - classification_loss: 0.0341 390/500 [======================>.......] - ETA: 37s - loss: 0.4438 - regression_loss: 0.4096 - classification_loss: 0.0341 391/500 [======================>.......] - ETA: 37s - loss: 0.4436 - regression_loss: 0.4096 - classification_loss: 0.0341 392/500 [======================>.......] - ETA: 37s - loss: 0.4432 - regression_loss: 0.4092 - classification_loss: 0.0340 393/500 [======================>.......] - ETA: 36s - loss: 0.4429 - regression_loss: 0.4089 - classification_loss: 0.0340 394/500 [======================>.......] - ETA: 36s - loss: 0.4433 - regression_loss: 0.4093 - classification_loss: 0.0340 395/500 [======================>.......] - ETA: 36s - loss: 0.4439 - regression_loss: 0.4099 - classification_loss: 0.0341 396/500 [======================>.......] - ETA: 35s - loss: 0.4447 - regression_loss: 0.4106 - classification_loss: 0.0341 397/500 [======================>.......] - ETA: 35s - loss: 0.4446 - regression_loss: 0.4106 - classification_loss: 0.0340 398/500 [======================>.......] 
- ETA: 35s - loss: 0.4450 - regression_loss: 0.4110 - classification_loss: 0.0340 399/500 [======================>.......] - ETA: 34s - loss: 0.4446 - regression_loss: 0.4106 - classification_loss: 0.0340 400/500 [=======================>......] - ETA: 34s - loss: 0.4444 - regression_loss: 0.4105 - classification_loss: 0.0340 401/500 [=======================>......] - ETA: 34s - loss: 0.4438 - regression_loss: 0.4099 - classification_loss: 0.0339 402/500 [=======================>......] - ETA: 33s - loss: 0.4441 - regression_loss: 0.4101 - classification_loss: 0.0340 403/500 [=======================>......] - ETA: 33s - loss: 0.4442 - regression_loss: 0.4102 - classification_loss: 0.0340 404/500 [=======================>......] - ETA: 32s - loss: 0.4440 - regression_loss: 0.4100 - classification_loss: 0.0340 405/500 [=======================>......] - ETA: 32s - loss: 0.4443 - regression_loss: 0.4102 - classification_loss: 0.0341 406/500 [=======================>......] - ETA: 32s - loss: 0.4439 - regression_loss: 0.4098 - classification_loss: 0.0341 407/500 [=======================>......] - ETA: 31s - loss: 0.4434 - regression_loss: 0.4094 - classification_loss: 0.0340 408/500 [=======================>......] - ETA: 31s - loss: 0.4434 - regression_loss: 0.4093 - classification_loss: 0.0341 409/500 [=======================>......] - ETA: 31s - loss: 0.4440 - regression_loss: 0.4098 - classification_loss: 0.0342 410/500 [=======================>......] - ETA: 30s - loss: 0.4438 - regression_loss: 0.4096 - classification_loss: 0.0342 411/500 [=======================>......] - ETA: 30s - loss: 0.4441 - regression_loss: 0.4099 - classification_loss: 0.0342 412/500 [=======================>......] - ETA: 30s - loss: 0.4442 - regression_loss: 0.4100 - classification_loss: 0.0342 413/500 [=======================>......] - ETA: 29s - loss: 0.4441 - regression_loss: 0.4099 - classification_loss: 0.0342 414/500 [=======================>......] 
[per-step progress-bar updates for steps 414-499 of epoch 25 omitted; loss fluctuated between ~0.434 and ~0.448, classification_loss ~0.033-0.035]
500/500 [==============================] - 172s 343ms/step - loss: 0.4337 - regression_loss: 0.4004 - classification_loss: 0.0333
1172 instances of class plum with average precision: 0.7333
mAP: 0.7333
Epoch 00025: saving model to ./training/snapshots/resnet101_pascal_25.h5
Epoch 26/150
[per-step progress-bar updates for steps 1-248 of epoch 26 omitted; ETA per step ~2:45 early in the epoch, loss settled near 0.41 (regression_loss ~0.377, classification_loss ~0.031) by step 248]
- ETA: 1:26 - loss: 0.4086 - regression_loss: 0.3776 - classification_loss: 0.0309 250/500 [==============>...............] - ETA: 1:25 - loss: 0.4089 - regression_loss: 0.3780 - classification_loss: 0.0309 251/500 [==============>...............] - ETA: 1:25 - loss: 0.4099 - regression_loss: 0.3788 - classification_loss: 0.0311 252/500 [==============>...............] - ETA: 1:25 - loss: 0.4097 - regression_loss: 0.3786 - classification_loss: 0.0311 253/500 [==============>...............] - ETA: 1:24 - loss: 0.4094 - regression_loss: 0.3784 - classification_loss: 0.0311 254/500 [==============>...............] - ETA: 1:24 - loss: 0.4097 - regression_loss: 0.3786 - classification_loss: 0.0311 255/500 [==============>...............] - ETA: 1:24 - loss: 0.4104 - regression_loss: 0.3793 - classification_loss: 0.0311 256/500 [==============>...............] - ETA: 1:23 - loss: 0.4107 - regression_loss: 0.3794 - classification_loss: 0.0312 257/500 [==============>...............] - ETA: 1:23 - loss: 0.4105 - regression_loss: 0.3794 - classification_loss: 0.0311 258/500 [==============>...............] - ETA: 1:23 - loss: 0.4100 - regression_loss: 0.3789 - classification_loss: 0.0311 259/500 [==============>...............] - ETA: 1:22 - loss: 0.4095 - regression_loss: 0.3785 - classification_loss: 0.0311 260/500 [==============>...............] - ETA: 1:22 - loss: 0.4105 - regression_loss: 0.3793 - classification_loss: 0.0311 261/500 [==============>...............] - ETA: 1:22 - loss: 0.4103 - regression_loss: 0.3791 - classification_loss: 0.0311 262/500 [==============>...............] - ETA: 1:21 - loss: 0.4106 - regression_loss: 0.3794 - classification_loss: 0.0312 263/500 [==============>...............] - ETA: 1:21 - loss: 0.4111 - regression_loss: 0.3799 - classification_loss: 0.0313 264/500 [==============>...............] - ETA: 1:20 - loss: 0.4109 - regression_loss: 0.3796 - classification_loss: 0.0313 265/500 [==============>...............] 
- ETA: 1:20 - loss: 0.4105 - regression_loss: 0.3792 - classification_loss: 0.0312 266/500 [==============>...............] - ETA: 1:20 - loss: 0.4122 - regression_loss: 0.3809 - classification_loss: 0.0313 267/500 [===============>..............] - ETA: 1:19 - loss: 0.4119 - regression_loss: 0.3807 - classification_loss: 0.0312 268/500 [===============>..............] - ETA: 1:19 - loss: 0.4118 - regression_loss: 0.3806 - classification_loss: 0.0313 269/500 [===============>..............] - ETA: 1:19 - loss: 0.4120 - regression_loss: 0.3807 - classification_loss: 0.0313 270/500 [===============>..............] - ETA: 1:18 - loss: 0.4130 - regression_loss: 0.3816 - classification_loss: 0.0314 271/500 [===============>..............] - ETA: 1:18 - loss: 0.4129 - regression_loss: 0.3815 - classification_loss: 0.0314 272/500 [===============>..............] - ETA: 1:18 - loss: 0.4124 - regression_loss: 0.3810 - classification_loss: 0.0314 273/500 [===============>..............] - ETA: 1:17 - loss: 0.4129 - regression_loss: 0.3814 - classification_loss: 0.0315 274/500 [===============>..............] - ETA: 1:17 - loss: 0.4135 - regression_loss: 0.3819 - classification_loss: 0.0316 275/500 [===============>..............] - ETA: 1:17 - loss: 0.4129 - regression_loss: 0.3814 - classification_loss: 0.0316 276/500 [===============>..............] - ETA: 1:16 - loss: 0.4124 - regression_loss: 0.3809 - classification_loss: 0.0315 277/500 [===============>..............] - ETA: 1:16 - loss: 0.4129 - regression_loss: 0.3814 - classification_loss: 0.0315 278/500 [===============>..............] - ETA: 1:16 - loss: 0.4136 - regression_loss: 0.3820 - classification_loss: 0.0316 279/500 [===============>..............] - ETA: 1:15 - loss: 0.4144 - regression_loss: 0.3828 - classification_loss: 0.0316 280/500 [===============>..............] - ETA: 1:15 - loss: 0.4151 - regression_loss: 0.3834 - classification_loss: 0.0317 281/500 [===============>..............] 
- ETA: 1:15 - loss: 0.4150 - regression_loss: 0.3833 - classification_loss: 0.0317 282/500 [===============>..............] - ETA: 1:14 - loss: 0.4144 - regression_loss: 0.3829 - classification_loss: 0.0316 283/500 [===============>..............] - ETA: 1:14 - loss: 0.4149 - regression_loss: 0.3832 - classification_loss: 0.0317 284/500 [================>.............] - ETA: 1:14 - loss: 0.4153 - regression_loss: 0.3836 - classification_loss: 0.0317 285/500 [================>.............] - ETA: 1:13 - loss: 0.4153 - regression_loss: 0.3837 - classification_loss: 0.0316 286/500 [================>.............] - ETA: 1:13 - loss: 0.4156 - regression_loss: 0.3840 - classification_loss: 0.0316 287/500 [================>.............] - ETA: 1:13 - loss: 0.4152 - regression_loss: 0.3836 - classification_loss: 0.0316 288/500 [================>.............] - ETA: 1:12 - loss: 0.4151 - regression_loss: 0.3835 - classification_loss: 0.0315 289/500 [================>.............] - ETA: 1:12 - loss: 0.4146 - regression_loss: 0.3832 - classification_loss: 0.0315 290/500 [================>.............] - ETA: 1:12 - loss: 0.4159 - regression_loss: 0.3844 - classification_loss: 0.0316 291/500 [================>.............] - ETA: 1:11 - loss: 0.4152 - regression_loss: 0.3836 - classification_loss: 0.0315 292/500 [================>.............] - ETA: 1:11 - loss: 0.4150 - regression_loss: 0.3835 - classification_loss: 0.0315 293/500 [================>.............] - ETA: 1:11 - loss: 0.4148 - regression_loss: 0.3833 - classification_loss: 0.0315 294/500 [================>.............] - ETA: 1:10 - loss: 0.4144 - regression_loss: 0.3830 - classification_loss: 0.0314 295/500 [================>.............] - ETA: 1:10 - loss: 0.4140 - regression_loss: 0.3826 - classification_loss: 0.0314 296/500 [================>.............] - ETA: 1:10 - loss: 0.4137 - regression_loss: 0.3824 - classification_loss: 0.0314 297/500 [================>.............] 
- ETA: 1:09 - loss: 0.4138 - regression_loss: 0.3823 - classification_loss: 0.0314 298/500 [================>.............] - ETA: 1:09 - loss: 0.4134 - regression_loss: 0.3820 - classification_loss: 0.0314 299/500 [================>.............] - ETA: 1:09 - loss: 0.4130 - regression_loss: 0.3816 - classification_loss: 0.0314 300/500 [=================>............] - ETA: 1:08 - loss: 0.4130 - regression_loss: 0.3817 - classification_loss: 0.0314 301/500 [=================>............] - ETA: 1:08 - loss: 0.4132 - regression_loss: 0.3818 - classification_loss: 0.0314 302/500 [=================>............] - ETA: 1:08 - loss: 0.4127 - regression_loss: 0.3813 - classification_loss: 0.0313 303/500 [=================>............] - ETA: 1:07 - loss: 0.4120 - regression_loss: 0.3807 - classification_loss: 0.0312 304/500 [=================>............] - ETA: 1:07 - loss: 0.4115 - regression_loss: 0.3802 - classification_loss: 0.0312 305/500 [=================>............] - ETA: 1:06 - loss: 0.4110 - regression_loss: 0.3798 - classification_loss: 0.0312 306/500 [=================>............] - ETA: 1:06 - loss: 0.4111 - regression_loss: 0.3799 - classification_loss: 0.0312 307/500 [=================>............] - ETA: 1:06 - loss: 0.4111 - regression_loss: 0.3799 - classification_loss: 0.0312 308/500 [=================>............] - ETA: 1:05 - loss: 0.4112 - regression_loss: 0.3799 - classification_loss: 0.0312 309/500 [=================>............] - ETA: 1:05 - loss: 0.4113 - regression_loss: 0.3801 - classification_loss: 0.0312 310/500 [=================>............] - ETA: 1:05 - loss: 0.4109 - regression_loss: 0.3797 - classification_loss: 0.0312 311/500 [=================>............] - ETA: 1:04 - loss: 0.4107 - regression_loss: 0.3796 - classification_loss: 0.0311 312/500 [=================>............] - ETA: 1:04 - loss: 0.4101 - regression_loss: 0.3790 - classification_loss: 0.0311 313/500 [=================>............] 
- ETA: 1:04 - loss: 0.4099 - regression_loss: 0.3788 - classification_loss: 0.0310 314/500 [=================>............] - ETA: 1:03 - loss: 0.4102 - regression_loss: 0.3792 - classification_loss: 0.0310 315/500 [=================>............] - ETA: 1:03 - loss: 0.4094 - regression_loss: 0.3784 - classification_loss: 0.0310 316/500 [=================>............] - ETA: 1:03 - loss: 0.4092 - regression_loss: 0.3782 - classification_loss: 0.0309 317/500 [==================>...........] - ETA: 1:02 - loss: 0.4094 - regression_loss: 0.3785 - classification_loss: 0.0309 318/500 [==================>...........] - ETA: 1:02 - loss: 0.4089 - regression_loss: 0.3780 - classification_loss: 0.0309 319/500 [==================>...........] - ETA: 1:02 - loss: 0.4088 - regression_loss: 0.3780 - classification_loss: 0.0308 320/500 [==================>...........] - ETA: 1:01 - loss: 0.4087 - regression_loss: 0.3778 - classification_loss: 0.0308 321/500 [==================>...........] - ETA: 1:01 - loss: 0.4090 - regression_loss: 0.3782 - classification_loss: 0.0308 322/500 [==================>...........] - ETA: 1:01 - loss: 0.4083 - regression_loss: 0.3776 - classification_loss: 0.0307 323/500 [==================>...........] - ETA: 1:00 - loss: 0.4079 - regression_loss: 0.3772 - classification_loss: 0.0307 324/500 [==================>...........] - ETA: 1:00 - loss: 0.4081 - regression_loss: 0.3774 - classification_loss: 0.0306 325/500 [==================>...........] - ETA: 1:00 - loss: 0.4079 - regression_loss: 0.3772 - classification_loss: 0.0306 326/500 [==================>...........] - ETA: 59s - loss: 0.4079 - regression_loss: 0.3773 - classification_loss: 0.0306  327/500 [==================>...........] - ETA: 59s - loss: 0.4084 - regression_loss: 0.3777 - classification_loss: 0.0307 328/500 [==================>...........] - ETA: 59s - loss: 0.4082 - regression_loss: 0.3775 - classification_loss: 0.0307 329/500 [==================>...........] 
- ETA: 58s - loss: 0.4082 - regression_loss: 0.3775 - classification_loss: 0.0306 330/500 [==================>...........] - ETA: 58s - loss: 0.4076 - regression_loss: 0.3770 - classification_loss: 0.0306 331/500 [==================>...........] - ETA: 58s - loss: 0.4076 - regression_loss: 0.3771 - classification_loss: 0.0306 332/500 [==================>...........] - ETA: 57s - loss: 0.4075 - regression_loss: 0.3770 - classification_loss: 0.0306 333/500 [==================>...........] - ETA: 57s - loss: 0.4071 - regression_loss: 0.3765 - classification_loss: 0.0305 334/500 [===================>..........] - ETA: 57s - loss: 0.4081 - regression_loss: 0.3774 - classification_loss: 0.0308 335/500 [===================>..........] - ETA: 56s - loss: 0.4083 - regression_loss: 0.3776 - classification_loss: 0.0308 336/500 [===================>..........] - ETA: 56s - loss: 0.4079 - regression_loss: 0.3772 - classification_loss: 0.0307 337/500 [===================>..........] - ETA: 56s - loss: 0.4078 - regression_loss: 0.3771 - classification_loss: 0.0307 338/500 [===================>..........] - ETA: 55s - loss: 0.4080 - regression_loss: 0.3773 - classification_loss: 0.0307 339/500 [===================>..........] - ETA: 55s - loss: 0.4080 - regression_loss: 0.3773 - classification_loss: 0.0307 340/500 [===================>..........] - ETA: 55s - loss: 0.4075 - regression_loss: 0.3769 - classification_loss: 0.0306 341/500 [===================>..........] - ETA: 54s - loss: 0.4070 - regression_loss: 0.3765 - classification_loss: 0.0306 342/500 [===================>..........] - ETA: 54s - loss: 0.4078 - regression_loss: 0.3771 - classification_loss: 0.0307 343/500 [===================>..........] - ETA: 53s - loss: 0.4079 - regression_loss: 0.3771 - classification_loss: 0.0308 344/500 [===================>..........] - ETA: 53s - loss: 0.4077 - regression_loss: 0.3769 - classification_loss: 0.0307 345/500 [===================>..........] 
- ETA: 53s - loss: 0.4076 - regression_loss: 0.3769 - classification_loss: 0.0307 346/500 [===================>..........] - ETA: 52s - loss: 0.4078 - regression_loss: 0.3770 - classification_loss: 0.0308 347/500 [===================>..........] - ETA: 52s - loss: 0.4072 - regression_loss: 0.3765 - classification_loss: 0.0307 348/500 [===================>..........] - ETA: 52s - loss: 0.4073 - regression_loss: 0.3765 - classification_loss: 0.0308 349/500 [===================>..........] - ETA: 51s - loss: 0.4070 - regression_loss: 0.3762 - classification_loss: 0.0308 350/500 [====================>.........] - ETA: 51s - loss: 0.4068 - regression_loss: 0.3761 - classification_loss: 0.0307 351/500 [====================>.........] - ETA: 51s - loss: 0.4067 - regression_loss: 0.3760 - classification_loss: 0.0307 352/500 [====================>.........] - ETA: 50s - loss: 0.4069 - regression_loss: 0.3762 - classification_loss: 0.0307 353/500 [====================>.........] - ETA: 50s - loss: 0.4073 - regression_loss: 0.3765 - classification_loss: 0.0307 354/500 [====================>.........] - ETA: 50s - loss: 0.4075 - regression_loss: 0.3767 - classification_loss: 0.0307 355/500 [====================>.........] - ETA: 49s - loss: 0.4074 - regression_loss: 0.3766 - classification_loss: 0.0307 356/500 [====================>.........] - ETA: 49s - loss: 0.4074 - regression_loss: 0.3767 - classification_loss: 0.0307 357/500 [====================>.........] - ETA: 49s - loss: 0.4072 - regression_loss: 0.3765 - classification_loss: 0.0307 358/500 [====================>.........] - ETA: 48s - loss: 0.4070 - regression_loss: 0.3763 - classification_loss: 0.0307 359/500 [====================>.........] - ETA: 48s - loss: 0.4063 - regression_loss: 0.3756 - classification_loss: 0.0306 360/500 [====================>.........] - ETA: 48s - loss: 0.4064 - regression_loss: 0.3758 - classification_loss: 0.0306 361/500 [====================>.........] 
- ETA: 47s - loss: 0.4056 - regression_loss: 0.3750 - classification_loss: 0.0306 362/500 [====================>.........] - ETA: 47s - loss: 0.4051 - regression_loss: 0.3746 - classification_loss: 0.0305 363/500 [====================>.........] - ETA: 47s - loss: 0.4054 - regression_loss: 0.3748 - classification_loss: 0.0306 364/500 [====================>.........] - ETA: 46s - loss: 0.4056 - regression_loss: 0.3751 - classification_loss: 0.0305 365/500 [====================>.........] - ETA: 46s - loss: 0.4055 - regression_loss: 0.3750 - classification_loss: 0.0305 366/500 [====================>.........] - ETA: 46s - loss: 0.4061 - regression_loss: 0.3755 - classification_loss: 0.0306 367/500 [=====================>........] - ETA: 45s - loss: 0.4054 - regression_loss: 0.3748 - classification_loss: 0.0306 368/500 [=====================>........] - ETA: 45s - loss: 0.4052 - regression_loss: 0.3746 - classification_loss: 0.0306 369/500 [=====================>........] - ETA: 45s - loss: 0.4046 - regression_loss: 0.3741 - classification_loss: 0.0306 370/500 [=====================>........] - ETA: 44s - loss: 0.4047 - regression_loss: 0.3742 - classification_loss: 0.0305 371/500 [=====================>........] - ETA: 44s - loss: 0.4053 - regression_loss: 0.3747 - classification_loss: 0.0306 372/500 [=====================>........] - ETA: 44s - loss: 0.4050 - regression_loss: 0.3745 - classification_loss: 0.0306 373/500 [=====================>........] - ETA: 43s - loss: 0.4041 - regression_loss: 0.3736 - classification_loss: 0.0305 374/500 [=====================>........] - ETA: 43s - loss: 0.4043 - regression_loss: 0.3738 - classification_loss: 0.0305 375/500 [=====================>........] - ETA: 42s - loss: 0.4047 - regression_loss: 0.3743 - classification_loss: 0.0305 376/500 [=====================>........] - ETA: 42s - loss: 0.4045 - regression_loss: 0.3740 - classification_loss: 0.0305 377/500 [=====================>........] 
- ETA: 42s - loss: 0.4043 - regression_loss: 0.3738 - classification_loss: 0.0305 378/500 [=====================>........] - ETA: 41s - loss: 0.4048 - regression_loss: 0.3742 - classification_loss: 0.0305 379/500 [=====================>........] - ETA: 41s - loss: 0.4047 - regression_loss: 0.3741 - classification_loss: 0.0306 380/500 [=====================>........] - ETA: 41s - loss: 0.4044 - regression_loss: 0.3738 - classification_loss: 0.0305 381/500 [=====================>........] - ETA: 40s - loss: 0.4046 - regression_loss: 0.3739 - classification_loss: 0.0307 382/500 [=====================>........] - ETA: 40s - loss: 0.4045 - regression_loss: 0.3738 - classification_loss: 0.0307 383/500 [=====================>........] - ETA: 40s - loss: 0.4045 - regression_loss: 0.3739 - classification_loss: 0.0306 384/500 [======================>.......] - ETA: 39s - loss: 0.4041 - regression_loss: 0.3735 - classification_loss: 0.0306 385/500 [======================>.......] - ETA: 39s - loss: 0.4040 - regression_loss: 0.3733 - classification_loss: 0.0306 386/500 [======================>.......] - ETA: 39s - loss: 0.4036 - regression_loss: 0.3730 - classification_loss: 0.0306 387/500 [======================>.......] - ETA: 38s - loss: 0.4036 - regression_loss: 0.3730 - classification_loss: 0.0306 388/500 [======================>.......] - ETA: 38s - loss: 0.4031 - regression_loss: 0.3726 - classification_loss: 0.0305 389/500 [======================>.......] - ETA: 38s - loss: 0.4026 - regression_loss: 0.3721 - classification_loss: 0.0305 390/500 [======================>.......] - ETA: 37s - loss: 0.4027 - regression_loss: 0.3722 - classification_loss: 0.0305 391/500 [======================>.......] - ETA: 37s - loss: 0.4026 - regression_loss: 0.3721 - classification_loss: 0.0305 392/500 [======================>.......] - ETA: 37s - loss: 0.4024 - regression_loss: 0.3719 - classification_loss: 0.0304 393/500 [======================>.......] 
- ETA: 36s - loss: 0.4018 - regression_loss: 0.3714 - classification_loss: 0.0304 394/500 [======================>.......] - ETA: 36s - loss: 0.4019 - regression_loss: 0.3715 - classification_loss: 0.0304 395/500 [======================>.......] - ETA: 36s - loss: 0.4016 - regression_loss: 0.3713 - classification_loss: 0.0303 396/500 [======================>.......] - ETA: 35s - loss: 0.4014 - regression_loss: 0.3711 - classification_loss: 0.0303 397/500 [======================>.......] - ETA: 35s - loss: 0.4012 - regression_loss: 0.3710 - classification_loss: 0.0302 398/500 [======================>.......] - ETA: 35s - loss: 0.4010 - regression_loss: 0.3708 - classification_loss: 0.0302 399/500 [======================>.......] - ETA: 34s - loss: 0.4006 - regression_loss: 0.3704 - classification_loss: 0.0302 400/500 [=======================>......] - ETA: 34s - loss: 0.4005 - regression_loss: 0.3704 - classification_loss: 0.0301 401/500 [=======================>......] - ETA: 34s - loss: 0.4005 - regression_loss: 0.3704 - classification_loss: 0.0301 402/500 [=======================>......] - ETA: 33s - loss: 0.4012 - regression_loss: 0.3710 - classification_loss: 0.0302 403/500 [=======================>......] - ETA: 33s - loss: 0.4015 - regression_loss: 0.3713 - classification_loss: 0.0302 404/500 [=======================>......] - ETA: 32s - loss: 0.4013 - regression_loss: 0.3712 - classification_loss: 0.0302 405/500 [=======================>......] - ETA: 32s - loss: 0.4015 - regression_loss: 0.3714 - classification_loss: 0.0302 406/500 [=======================>......] - ETA: 32s - loss: 0.4021 - regression_loss: 0.3718 - classification_loss: 0.0302 407/500 [=======================>......] - ETA: 31s - loss: 0.4015 - regression_loss: 0.3714 - classification_loss: 0.0302 408/500 [=======================>......] - ETA: 31s - loss: 0.4014 - regression_loss: 0.3712 - classification_loss: 0.0302 409/500 [=======================>......] 
- ETA: 31s - loss: 0.4008 - regression_loss: 0.3707 - classification_loss: 0.0301 410/500 [=======================>......] - ETA: 30s - loss: 0.4007 - regression_loss: 0.3706 - classification_loss: 0.0301 411/500 [=======================>......] - ETA: 30s - loss: 0.4003 - regression_loss: 0.3702 - classification_loss: 0.0300 412/500 [=======================>......] - ETA: 30s - loss: 0.4001 - regression_loss: 0.3701 - classification_loss: 0.0300 413/500 [=======================>......] - ETA: 29s - loss: 0.3996 - regression_loss: 0.3696 - classification_loss: 0.0300 414/500 [=======================>......] - ETA: 29s - loss: 0.4000 - regression_loss: 0.3700 - classification_loss: 0.0301 415/500 [=======================>......] - ETA: 29s - loss: 0.4001 - regression_loss: 0.3701 - classification_loss: 0.0300 416/500 [=======================>......] - ETA: 28s - loss: 0.4002 - regression_loss: 0.3701 - classification_loss: 0.0301 417/500 [========================>.....] - ETA: 28s - loss: 0.4006 - regression_loss: 0.3704 - classification_loss: 0.0301 418/500 [========================>.....] - ETA: 28s - loss: 0.4002 - regression_loss: 0.3701 - classification_loss: 0.0301 419/500 [========================>.....] - ETA: 27s - loss: 0.4000 - regression_loss: 0.3699 - classification_loss: 0.0300 420/500 [========================>.....] - ETA: 27s - loss: 0.3996 - regression_loss: 0.3696 - classification_loss: 0.0300 421/500 [========================>.....] - ETA: 27s - loss: 0.3995 - regression_loss: 0.3696 - classification_loss: 0.0300 422/500 [========================>.....] - ETA: 26s - loss: 0.4001 - regression_loss: 0.3700 - classification_loss: 0.0301 423/500 [========================>.....] - ETA: 26s - loss: 0.4002 - regression_loss: 0.3701 - classification_loss: 0.0300 424/500 [========================>.....] - ETA: 26s - loss: 0.4004 - regression_loss: 0.3704 - classification_loss: 0.0301 425/500 [========================>.....] 
- ETA: 25s - loss: 0.4002 - regression_loss: 0.3702 - classification_loss: 0.0300 426/500 [========================>.....] - ETA: 25s - loss: 0.3999 - regression_loss: 0.3699 - classification_loss: 0.0300 427/500 [========================>.....] - ETA: 25s - loss: 0.3996 - regression_loss: 0.3697 - classification_loss: 0.0300 428/500 [========================>.....] - ETA: 24s - loss: 0.4001 - regression_loss: 0.3701 - classification_loss: 0.0300 429/500 [========================>.....] - ETA: 24s - loss: 0.3999 - regression_loss: 0.3699 - classification_loss: 0.0300 430/500 [========================>.....] - ETA: 24s - loss: 0.4003 - regression_loss: 0.3702 - classification_loss: 0.0301 431/500 [========================>.....] - ETA: 23s - loss: 0.4005 - regression_loss: 0.3704 - classification_loss: 0.0301 432/500 [========================>.....] - ETA: 23s - loss: 0.4011 - regression_loss: 0.3709 - classification_loss: 0.0302 433/500 [========================>.....] - ETA: 23s - loss: 0.4014 - regression_loss: 0.3713 - classification_loss: 0.0302 434/500 [=========================>....] - ETA: 22s - loss: 0.4016 - regression_loss: 0.3714 - classification_loss: 0.0302 435/500 [=========================>....] - ETA: 22s - loss: 0.4018 - regression_loss: 0.3716 - classification_loss: 0.0302 436/500 [=========================>....] - ETA: 22s - loss: 0.4015 - regression_loss: 0.3713 - classification_loss: 0.0302 437/500 [=========================>....] - ETA: 21s - loss: 0.4016 - regression_loss: 0.3713 - classification_loss: 0.0303 438/500 [=========================>....] - ETA: 21s - loss: 0.4014 - regression_loss: 0.3712 - classification_loss: 0.0302 439/500 [=========================>....] - ETA: 20s - loss: 0.4016 - regression_loss: 0.3714 - classification_loss: 0.0302 440/500 [=========================>....] - ETA: 20s - loss: 0.4015 - regression_loss: 0.3713 - classification_loss: 0.0302 441/500 [=========================>....] 
- ETA: 20s - loss: 0.4020 - regression_loss: 0.3717 - classification_loss: 0.0303 442/500 [=========================>....] - ETA: 19s - loss: 0.4020 - regression_loss: 0.3717 - classification_loss: 0.0303 443/500 [=========================>....] - ETA: 19s - loss: 0.4020 - regression_loss: 0.3717 - classification_loss: 0.0303 444/500 [=========================>....] - ETA: 19s - loss: 0.4024 - regression_loss: 0.3721 - classification_loss: 0.0303 445/500 [=========================>....] - ETA: 18s - loss: 0.4019 - regression_loss: 0.3716 - classification_loss: 0.0303 446/500 [=========================>....] - ETA: 18s - loss: 0.4026 - regression_loss: 0.3722 - classification_loss: 0.0303 447/500 [=========================>....] - ETA: 18s - loss: 0.4027 - regression_loss: 0.3723 - classification_loss: 0.0303 448/500 [=========================>....] - ETA: 17s - loss: 0.4029 - regression_loss: 0.3726 - classification_loss: 0.0303 449/500 [=========================>....] - ETA: 17s - loss: 0.4034 - regression_loss: 0.3730 - classification_loss: 0.0304 450/500 [==========================>...] - ETA: 17s - loss: 0.4032 - regression_loss: 0.3728 - classification_loss: 0.0303 451/500 [==========================>...] - ETA: 16s - loss: 0.4031 - regression_loss: 0.3728 - classification_loss: 0.0303 452/500 [==========================>...] - ETA: 16s - loss: 0.4030 - regression_loss: 0.3727 - classification_loss: 0.0303 453/500 [==========================>...] - ETA: 16s - loss: 0.4027 - regression_loss: 0.3725 - classification_loss: 0.0303 454/500 [==========================>...] - ETA: 15s - loss: 0.4029 - regression_loss: 0.3725 - classification_loss: 0.0304 455/500 [==========================>...] - ETA: 15s - loss: 0.4036 - regression_loss: 0.3731 - classification_loss: 0.0305 456/500 [==========================>...] - ETA: 15s - loss: 0.4037 - regression_loss: 0.3732 - classification_loss: 0.0305 457/500 [==========================>...] 
500/500 [==============================] - 172s 344ms/step - loss: 0.4119 - regression_loss: 0.3808 - classification_loss: 0.0311
1172 instances of class plum with average precision: 0.7309
mAP: 0.7309
Epoch 00026: saving model to ./training/snapshots/resnet101_pascal_26.h5
Epoch 27/150
292/500 [================>.............] 
- ETA: 1:11 - loss: 0.4008 - regression_loss: 0.3701 - classification_loss: 0.0307 293/500 [================>.............] - ETA: 1:10 - loss: 0.4002 - regression_loss: 0.3695 - classification_loss: 0.0307 294/500 [================>.............] - ETA: 1:10 - loss: 0.4008 - regression_loss: 0.3701 - classification_loss: 0.0307 295/500 [================>.............] - ETA: 1:10 - loss: 0.4012 - regression_loss: 0.3704 - classification_loss: 0.0308 296/500 [================>.............] - ETA: 1:09 - loss: 0.4024 - regression_loss: 0.3715 - classification_loss: 0.0310 297/500 [================>.............] - ETA: 1:09 - loss: 0.4026 - regression_loss: 0.3716 - classification_loss: 0.0310 298/500 [================>.............] - ETA: 1:09 - loss: 0.4029 - regression_loss: 0.3719 - classification_loss: 0.0310 299/500 [================>.............] - ETA: 1:08 - loss: 0.4029 - regression_loss: 0.3720 - classification_loss: 0.0309 300/500 [=================>............] - ETA: 1:08 - loss: 0.4043 - regression_loss: 0.3732 - classification_loss: 0.0311 301/500 [=================>............] - ETA: 1:08 - loss: 0.4042 - regression_loss: 0.3731 - classification_loss: 0.0311 302/500 [=================>............] - ETA: 1:07 - loss: 0.4042 - regression_loss: 0.3732 - classification_loss: 0.0310 303/500 [=================>............] - ETA: 1:07 - loss: 0.4039 - regression_loss: 0.3729 - classification_loss: 0.0310 304/500 [=================>............] - ETA: 1:07 - loss: 0.4040 - regression_loss: 0.3730 - classification_loss: 0.0310 305/500 [=================>............] - ETA: 1:06 - loss: 0.4043 - regression_loss: 0.3733 - classification_loss: 0.0310 306/500 [=================>............] - ETA: 1:06 - loss: 0.4049 - regression_loss: 0.3739 - classification_loss: 0.0310 307/500 [=================>............] - ETA: 1:06 - loss: 0.4044 - regression_loss: 0.3734 - classification_loss: 0.0309 308/500 [=================>............] 
- ETA: 1:05 - loss: 0.4045 - regression_loss: 0.3736 - classification_loss: 0.0310 309/500 [=================>............] - ETA: 1:05 - loss: 0.4044 - regression_loss: 0.3734 - classification_loss: 0.0309 310/500 [=================>............] - ETA: 1:05 - loss: 0.4039 - regression_loss: 0.3730 - classification_loss: 0.0309 311/500 [=================>............] - ETA: 1:04 - loss: 0.4036 - regression_loss: 0.3728 - classification_loss: 0.0308 312/500 [=================>............] - ETA: 1:04 - loss: 0.4043 - regression_loss: 0.3734 - classification_loss: 0.0309 313/500 [=================>............] - ETA: 1:04 - loss: 0.4044 - regression_loss: 0.3735 - classification_loss: 0.0308 314/500 [=================>............] - ETA: 1:03 - loss: 0.4053 - regression_loss: 0.3744 - classification_loss: 0.0309 315/500 [=================>............] - ETA: 1:03 - loss: 0.4059 - regression_loss: 0.3750 - classification_loss: 0.0309 316/500 [=================>............] - ETA: 1:03 - loss: 0.4067 - regression_loss: 0.3757 - classification_loss: 0.0310 317/500 [==================>...........] - ETA: 1:02 - loss: 0.4059 - regression_loss: 0.3750 - classification_loss: 0.0309 318/500 [==================>...........] - ETA: 1:02 - loss: 0.4058 - regression_loss: 0.3749 - classification_loss: 0.0309 319/500 [==================>...........] - ETA: 1:01 - loss: 0.4060 - regression_loss: 0.3751 - classification_loss: 0.0309 320/500 [==================>...........] - ETA: 1:01 - loss: 0.4065 - regression_loss: 0.3756 - classification_loss: 0.0310 321/500 [==================>...........] - ETA: 1:01 - loss: 0.4076 - regression_loss: 0.3765 - classification_loss: 0.0311 322/500 [==================>...........] - ETA: 1:00 - loss: 0.4073 - regression_loss: 0.3763 - classification_loss: 0.0311 323/500 [==================>...........] - ETA: 1:00 - loss: 0.4078 - regression_loss: 0.3767 - classification_loss: 0.0311 324/500 [==================>...........] 
- ETA: 1:00 - loss: 0.4074 - regression_loss: 0.3763 - classification_loss: 0.0311 325/500 [==================>...........] - ETA: 59s - loss: 0.4071 - regression_loss: 0.3761 - classification_loss: 0.0310  326/500 [==================>...........] - ETA: 59s - loss: 0.4066 - regression_loss: 0.3757 - classification_loss: 0.0309 327/500 [==================>...........] - ETA: 59s - loss: 0.4066 - regression_loss: 0.3756 - classification_loss: 0.0310 328/500 [==================>...........] - ETA: 58s - loss: 0.4061 - regression_loss: 0.3751 - classification_loss: 0.0309 329/500 [==================>...........] - ETA: 58s - loss: 0.4067 - regression_loss: 0.3757 - classification_loss: 0.0310 330/500 [==================>...........] - ETA: 58s - loss: 0.4071 - regression_loss: 0.3760 - classification_loss: 0.0310 331/500 [==================>...........] - ETA: 57s - loss: 0.4067 - regression_loss: 0.3757 - classification_loss: 0.0310 332/500 [==================>...........] - ETA: 57s - loss: 0.4062 - regression_loss: 0.3752 - classification_loss: 0.0309 333/500 [==================>...........] - ETA: 57s - loss: 0.4058 - regression_loss: 0.3749 - classification_loss: 0.0309 334/500 [===================>..........] - ETA: 56s - loss: 0.4060 - regression_loss: 0.3750 - classification_loss: 0.0310 335/500 [===================>..........] - ETA: 56s - loss: 0.4056 - regression_loss: 0.3747 - classification_loss: 0.0309 336/500 [===================>..........] - ETA: 56s - loss: 0.4052 - regression_loss: 0.3743 - classification_loss: 0.0309 337/500 [===================>..........] - ETA: 55s - loss: 0.4052 - regression_loss: 0.3742 - classification_loss: 0.0309 338/500 [===================>..........] - ETA: 55s - loss: 0.4046 - regression_loss: 0.3738 - classification_loss: 0.0309 339/500 [===================>..........] - ETA: 55s - loss: 0.4046 - regression_loss: 0.3738 - classification_loss: 0.0308 340/500 [===================>..........] 
- ETA: 54s - loss: 0.4044 - regression_loss: 0.3736 - classification_loss: 0.0308 341/500 [===================>..........] - ETA: 54s - loss: 0.4044 - regression_loss: 0.3736 - classification_loss: 0.0308 342/500 [===================>..........] - ETA: 54s - loss: 0.4041 - regression_loss: 0.3732 - classification_loss: 0.0308 343/500 [===================>..........] - ETA: 53s - loss: 0.4034 - regression_loss: 0.3727 - classification_loss: 0.0308 344/500 [===================>..........] - ETA: 53s - loss: 0.4030 - regression_loss: 0.3723 - classification_loss: 0.0307 345/500 [===================>..........] - ETA: 53s - loss: 0.4032 - regression_loss: 0.3725 - classification_loss: 0.0307 346/500 [===================>..........] - ETA: 52s - loss: 0.4036 - regression_loss: 0.3728 - classification_loss: 0.0307 347/500 [===================>..........] - ETA: 52s - loss: 0.4033 - regression_loss: 0.3726 - classification_loss: 0.0307 348/500 [===================>..........] - ETA: 52s - loss: 0.4029 - regression_loss: 0.3722 - classification_loss: 0.0307 349/500 [===================>..........] - ETA: 51s - loss: 0.4029 - regression_loss: 0.3722 - classification_loss: 0.0307 350/500 [====================>.........] - ETA: 51s - loss: 0.4027 - regression_loss: 0.3721 - classification_loss: 0.0306 351/500 [====================>.........] - ETA: 51s - loss: 0.4031 - regression_loss: 0.3725 - classification_loss: 0.0306 352/500 [====================>.........] - ETA: 50s - loss: 0.4030 - regression_loss: 0.3724 - classification_loss: 0.0306 353/500 [====================>.........] - ETA: 50s - loss: 0.4025 - regression_loss: 0.3720 - classification_loss: 0.0305 354/500 [====================>.........] - ETA: 50s - loss: 0.4021 - regression_loss: 0.3716 - classification_loss: 0.0305 355/500 [====================>.........] - ETA: 49s - loss: 0.4018 - regression_loss: 0.3713 - classification_loss: 0.0305 356/500 [====================>.........] 
- ETA: 49s - loss: 0.4021 - regression_loss: 0.3716 - classification_loss: 0.0306 357/500 [====================>.........] - ETA: 49s - loss: 0.4018 - regression_loss: 0.3712 - classification_loss: 0.0306 358/500 [====================>.........] - ETA: 48s - loss: 0.4020 - regression_loss: 0.3714 - classification_loss: 0.0306 359/500 [====================>.........] - ETA: 48s - loss: 0.4024 - regression_loss: 0.3717 - classification_loss: 0.0307 360/500 [====================>.........] - ETA: 47s - loss: 0.4021 - regression_loss: 0.3715 - classification_loss: 0.0306 361/500 [====================>.........] - ETA: 47s - loss: 0.4023 - regression_loss: 0.3717 - classification_loss: 0.0307 362/500 [====================>.........] - ETA: 47s - loss: 0.4024 - regression_loss: 0.3717 - classification_loss: 0.0307 363/500 [====================>.........] - ETA: 46s - loss: 0.4023 - regression_loss: 0.3716 - classification_loss: 0.0306 364/500 [====================>.........] - ETA: 46s - loss: 0.4019 - regression_loss: 0.3714 - classification_loss: 0.0306 365/500 [====================>.........] - ETA: 46s - loss: 0.4013 - regression_loss: 0.3708 - classification_loss: 0.0305 366/500 [====================>.........] - ETA: 45s - loss: 0.4013 - regression_loss: 0.3708 - classification_loss: 0.0305 367/500 [=====================>........] - ETA: 45s - loss: 0.4007 - regression_loss: 0.3702 - classification_loss: 0.0305 368/500 [=====================>........] - ETA: 45s - loss: 0.4004 - regression_loss: 0.3699 - classification_loss: 0.0304 369/500 [=====================>........] - ETA: 44s - loss: 0.4004 - regression_loss: 0.3699 - classification_loss: 0.0304 370/500 [=====================>........] - ETA: 44s - loss: 0.4002 - regression_loss: 0.3698 - classification_loss: 0.0304 371/500 [=====================>........] - ETA: 44s - loss: 0.4002 - regression_loss: 0.3698 - classification_loss: 0.0304 372/500 [=====================>........] 
- ETA: 43s - loss: 0.4001 - regression_loss: 0.3697 - classification_loss: 0.0304 373/500 [=====================>........] - ETA: 43s - loss: 0.4001 - regression_loss: 0.3697 - classification_loss: 0.0304 374/500 [=====================>........] - ETA: 43s - loss: 0.3998 - regression_loss: 0.3694 - classification_loss: 0.0304 375/500 [=====================>........] - ETA: 42s - loss: 0.3994 - regression_loss: 0.3690 - classification_loss: 0.0304 376/500 [=====================>........] - ETA: 42s - loss: 0.3994 - regression_loss: 0.3690 - classification_loss: 0.0304 377/500 [=====================>........] - ETA: 42s - loss: 0.3989 - regression_loss: 0.3685 - classification_loss: 0.0303 378/500 [=====================>........] - ETA: 41s - loss: 0.3983 - regression_loss: 0.3681 - classification_loss: 0.0303 379/500 [=====================>........] - ETA: 41s - loss: 0.3992 - regression_loss: 0.3689 - classification_loss: 0.0303 380/500 [=====================>........] - ETA: 41s - loss: 0.3988 - regression_loss: 0.3686 - classification_loss: 0.0303 381/500 [=====================>........] - ETA: 40s - loss: 0.3985 - regression_loss: 0.3682 - classification_loss: 0.0302 382/500 [=====================>........] - ETA: 40s - loss: 0.3981 - regression_loss: 0.3679 - classification_loss: 0.0302 383/500 [=====================>........] - ETA: 40s - loss: 0.3977 - regression_loss: 0.3676 - classification_loss: 0.0301 384/500 [======================>.......] - ETA: 39s - loss: 0.3985 - regression_loss: 0.3683 - classification_loss: 0.0301 385/500 [======================>.......] - ETA: 39s - loss: 0.3987 - regression_loss: 0.3685 - classification_loss: 0.0302 386/500 [======================>.......] - ETA: 39s - loss: 0.3983 - regression_loss: 0.3681 - classification_loss: 0.0301 387/500 [======================>.......] - ETA: 38s - loss: 0.3984 - regression_loss: 0.3683 - classification_loss: 0.0301 388/500 [======================>.......] 
- ETA: 38s - loss: 0.3984 - regression_loss: 0.3683 - classification_loss: 0.0301 389/500 [======================>.......] - ETA: 38s - loss: 0.3986 - regression_loss: 0.3685 - classification_loss: 0.0301 390/500 [======================>.......] - ETA: 37s - loss: 0.3988 - regression_loss: 0.3687 - classification_loss: 0.0301 391/500 [======================>.......] - ETA: 37s - loss: 0.3987 - regression_loss: 0.3686 - classification_loss: 0.0301 392/500 [======================>.......] - ETA: 37s - loss: 0.3983 - regression_loss: 0.3682 - classification_loss: 0.0301 393/500 [======================>.......] - ETA: 36s - loss: 0.3980 - regression_loss: 0.3679 - classification_loss: 0.0300 394/500 [======================>.......] - ETA: 36s - loss: 0.3979 - regression_loss: 0.3679 - classification_loss: 0.0300 395/500 [======================>.......] - ETA: 35s - loss: 0.3973 - regression_loss: 0.3673 - classification_loss: 0.0300 396/500 [======================>.......] - ETA: 35s - loss: 0.3975 - regression_loss: 0.3675 - classification_loss: 0.0300 397/500 [======================>.......] - ETA: 35s - loss: 0.3974 - regression_loss: 0.3674 - classification_loss: 0.0300 398/500 [======================>.......] - ETA: 34s - loss: 0.3971 - regression_loss: 0.3672 - classification_loss: 0.0300 399/500 [======================>.......] - ETA: 34s - loss: 0.3971 - regression_loss: 0.3671 - classification_loss: 0.0299 400/500 [=======================>......] - ETA: 34s - loss: 0.3974 - regression_loss: 0.3674 - classification_loss: 0.0300 401/500 [=======================>......] - ETA: 33s - loss: 0.3975 - regression_loss: 0.3675 - classification_loss: 0.0300 402/500 [=======================>......] - ETA: 33s - loss: 0.3972 - regression_loss: 0.3672 - classification_loss: 0.0300 403/500 [=======================>......] - ETA: 33s - loss: 0.3966 - regression_loss: 0.3667 - classification_loss: 0.0299 404/500 [=======================>......] 
- ETA: 32s - loss: 0.3967 - regression_loss: 0.3668 - classification_loss: 0.0299 405/500 [=======================>......] - ETA: 32s - loss: 0.3959 - regression_loss: 0.3660 - classification_loss: 0.0299 406/500 [=======================>......] - ETA: 32s - loss: 0.3958 - regression_loss: 0.3660 - classification_loss: 0.0299 407/500 [=======================>......] - ETA: 31s - loss: 0.3969 - regression_loss: 0.3668 - classification_loss: 0.0301 408/500 [=======================>......] - ETA: 31s - loss: 0.3963 - regression_loss: 0.3663 - classification_loss: 0.0300 409/500 [=======================>......] - ETA: 31s - loss: 0.3958 - regression_loss: 0.3659 - classification_loss: 0.0300 410/500 [=======================>......] - ETA: 30s - loss: 0.3952 - regression_loss: 0.3653 - classification_loss: 0.0299 411/500 [=======================>......] - ETA: 30s - loss: 0.3957 - regression_loss: 0.3657 - classification_loss: 0.0300 412/500 [=======================>......] - ETA: 30s - loss: 0.3958 - regression_loss: 0.3659 - classification_loss: 0.0299 413/500 [=======================>......] - ETA: 29s - loss: 0.3958 - regression_loss: 0.3659 - classification_loss: 0.0299 414/500 [=======================>......] - ETA: 29s - loss: 0.3956 - regression_loss: 0.3657 - classification_loss: 0.0299 415/500 [=======================>......] - ETA: 29s - loss: 0.3953 - regression_loss: 0.3655 - classification_loss: 0.0298 416/500 [=======================>......] - ETA: 28s - loss: 0.3953 - regression_loss: 0.3655 - classification_loss: 0.0298 417/500 [========================>.....] - ETA: 28s - loss: 0.3951 - regression_loss: 0.3653 - classification_loss: 0.0298 418/500 [========================>.....] - ETA: 28s - loss: 0.3950 - regression_loss: 0.3652 - classification_loss: 0.0298 419/500 [========================>.....] - ETA: 27s - loss: 0.3956 - regression_loss: 0.3657 - classification_loss: 0.0298 420/500 [========================>.....] 
- ETA: 27s - loss: 0.3953 - regression_loss: 0.3654 - classification_loss: 0.0298 421/500 [========================>.....] - ETA: 27s - loss: 0.3949 - regression_loss: 0.3651 - classification_loss: 0.0298 422/500 [========================>.....] - ETA: 26s - loss: 0.3958 - regression_loss: 0.3658 - classification_loss: 0.0300 423/500 [========================>.....] - ETA: 26s - loss: 0.3956 - regression_loss: 0.3656 - classification_loss: 0.0300 424/500 [========================>.....] - ETA: 26s - loss: 0.3959 - regression_loss: 0.3659 - classification_loss: 0.0300 425/500 [========================>.....] - ETA: 25s - loss: 0.3961 - regression_loss: 0.3661 - classification_loss: 0.0301 426/500 [========================>.....] - ETA: 25s - loss: 0.3966 - regression_loss: 0.3665 - classification_loss: 0.0301 427/500 [========================>.....] - ETA: 25s - loss: 0.3966 - regression_loss: 0.3665 - classification_loss: 0.0301 428/500 [========================>.....] - ETA: 24s - loss: 0.3967 - regression_loss: 0.3666 - classification_loss: 0.0300 429/500 [========================>.....] - ETA: 24s - loss: 0.3965 - regression_loss: 0.3665 - classification_loss: 0.0300 430/500 [========================>.....] - ETA: 24s - loss: 0.3960 - regression_loss: 0.3660 - classification_loss: 0.0300 431/500 [========================>.....] - ETA: 23s - loss: 0.3966 - regression_loss: 0.3665 - classification_loss: 0.0301 432/500 [========================>.....] - ETA: 23s - loss: 0.3961 - regression_loss: 0.3660 - classification_loss: 0.0300 433/500 [========================>.....] - ETA: 22s - loss: 0.3965 - regression_loss: 0.3664 - classification_loss: 0.0301 434/500 [=========================>....] - ETA: 22s - loss: 0.3966 - regression_loss: 0.3665 - classification_loss: 0.0301 435/500 [=========================>....] - ETA: 22s - loss: 0.3965 - regression_loss: 0.3664 - classification_loss: 0.0301 436/500 [=========================>....] 
- ETA: 21s - loss: 0.3965 - regression_loss: 0.3664 - classification_loss: 0.0301 437/500 [=========================>....] - ETA: 21s - loss: 0.3965 - regression_loss: 0.3664 - classification_loss: 0.0301 438/500 [=========================>....] - ETA: 21s - loss: 0.3961 - regression_loss: 0.3661 - classification_loss: 0.0301 439/500 [=========================>....] - ETA: 20s - loss: 0.3969 - regression_loss: 0.3668 - classification_loss: 0.0301 440/500 [=========================>....] - ETA: 20s - loss: 0.3967 - regression_loss: 0.3667 - classification_loss: 0.0300 441/500 [=========================>....] - ETA: 20s - loss: 0.3961 - regression_loss: 0.3662 - classification_loss: 0.0300 442/500 [=========================>....] - ETA: 19s - loss: 0.3961 - regression_loss: 0.3660 - classification_loss: 0.0300 443/500 [=========================>....] - ETA: 19s - loss: 0.3962 - regression_loss: 0.3662 - classification_loss: 0.0300 444/500 [=========================>....] - ETA: 19s - loss: 0.3961 - regression_loss: 0.3661 - classification_loss: 0.0300 445/500 [=========================>....] - ETA: 18s - loss: 0.3965 - regression_loss: 0.3665 - classification_loss: 0.0301 446/500 [=========================>....] - ETA: 18s - loss: 0.3968 - regression_loss: 0.3667 - classification_loss: 0.0301 447/500 [=========================>....] - ETA: 18s - loss: 0.3968 - regression_loss: 0.3668 - classification_loss: 0.0301 448/500 [=========================>....] - ETA: 17s - loss: 0.3967 - regression_loss: 0.3666 - classification_loss: 0.0301 449/500 [=========================>....] - ETA: 17s - loss: 0.3967 - regression_loss: 0.3667 - classification_loss: 0.0300 450/500 [==========================>...] - ETA: 17s - loss: 0.3970 - regression_loss: 0.3670 - classification_loss: 0.0300 451/500 [==========================>...] - ETA: 16s - loss: 0.3969 - regression_loss: 0.3669 - classification_loss: 0.0300 452/500 [==========================>...] 
- ETA: 16s - loss: 0.3969 - regression_loss: 0.3669 - classification_loss: 0.0300 453/500 [==========================>...] - ETA: 16s - loss: 0.3964 - regression_loss: 0.3664 - classification_loss: 0.0300 454/500 [==========================>...] - ETA: 15s - loss: 0.3959 - regression_loss: 0.3660 - classification_loss: 0.0299 455/500 [==========================>...] - ETA: 15s - loss: 0.3958 - regression_loss: 0.3659 - classification_loss: 0.0299 456/500 [==========================>...] - ETA: 15s - loss: 0.3959 - regression_loss: 0.3660 - classification_loss: 0.0299 457/500 [==========================>...] - ETA: 14s - loss: 0.3958 - regression_loss: 0.3659 - classification_loss: 0.0299 458/500 [==========================>...] - ETA: 14s - loss: 0.3963 - regression_loss: 0.3664 - classification_loss: 0.0300 459/500 [==========================>...] - ETA: 14s - loss: 0.3963 - regression_loss: 0.3663 - classification_loss: 0.0300 460/500 [==========================>...] - ETA: 13s - loss: 0.3956 - regression_loss: 0.3657 - classification_loss: 0.0299 461/500 [==========================>...] - ETA: 13s - loss: 0.3957 - regression_loss: 0.3657 - classification_loss: 0.0299 462/500 [==========================>...] - ETA: 13s - loss: 0.3955 - regression_loss: 0.3655 - classification_loss: 0.0299 463/500 [==========================>...] - ETA: 12s - loss: 0.3953 - regression_loss: 0.3654 - classification_loss: 0.0299 464/500 [==========================>...] - ETA: 12s - loss: 0.3952 - regression_loss: 0.3653 - classification_loss: 0.0299 465/500 [==========================>...] - ETA: 12s - loss: 0.3954 - regression_loss: 0.3655 - classification_loss: 0.0299 466/500 [==========================>...] - ETA: 11s - loss: 0.3956 - regression_loss: 0.3657 - classification_loss: 0.0299 467/500 [===========================>..] - ETA: 11s - loss: 0.3952 - regression_loss: 0.3653 - classification_loss: 0.0299 468/500 [===========================>..] 
- ETA: 10s - loss: 0.3948 - regression_loss: 0.3649 - classification_loss: 0.0299 469/500 [===========================>..] - ETA: 10s - loss: 0.3949 - regression_loss: 0.3651 - classification_loss: 0.0299 470/500 [===========================>..] - ETA: 10s - loss: 0.3948 - regression_loss: 0.3650 - classification_loss: 0.0298 471/500 [===========================>..] - ETA: 9s - loss: 0.3952 - regression_loss: 0.3653 - classification_loss: 0.0299  472/500 [===========================>..] - ETA: 9s - loss: 0.3952 - regression_loss: 0.3653 - classification_loss: 0.0299 473/500 [===========================>..] - ETA: 9s - loss: 0.3951 - regression_loss: 0.3652 - classification_loss: 0.0299 474/500 [===========================>..] - ETA: 8s - loss: 0.3955 - regression_loss: 0.3655 - classification_loss: 0.0300 475/500 [===========================>..] - ETA: 8s - loss: 0.3957 - regression_loss: 0.3657 - classification_loss: 0.0300 476/500 [===========================>..] - ETA: 8s - loss: 0.3959 - regression_loss: 0.3658 - classification_loss: 0.0300 477/500 [===========================>..] - ETA: 7s - loss: 0.3960 - regression_loss: 0.3660 - classification_loss: 0.0301 478/500 [===========================>..] - ETA: 7s - loss: 0.3956 - regression_loss: 0.3656 - classification_loss: 0.0300 479/500 [===========================>..] - ETA: 7s - loss: 0.3960 - regression_loss: 0.3659 - classification_loss: 0.0301 480/500 [===========================>..] - ETA: 6s - loss: 0.3965 - regression_loss: 0.3663 - classification_loss: 0.0301 481/500 [===========================>..] - ETA: 6s - loss: 0.3959 - regression_loss: 0.3658 - classification_loss: 0.0301 482/500 [===========================>..] - ETA: 6s - loss: 0.3963 - regression_loss: 0.3662 - classification_loss: 0.0301 483/500 [===========================>..] - ETA: 5s - loss: 0.3960 - regression_loss: 0.3659 - classification_loss: 0.0300 484/500 [============================>.] 
- ETA: 5s - loss: 0.3959 - regression_loss: 0.3659 - classification_loss: 0.0300 485/500 [============================>.] - ETA: 5s - loss: 0.3959 - regression_loss: 0.3659 - classification_loss: 0.0301 486/500 [============================>.] - ETA: 4s - loss: 0.3955 - regression_loss: 0.3655 - classification_loss: 0.0300 487/500 [============================>.] - ETA: 4s - loss: 0.3958 - regression_loss: 0.3658 - classification_loss: 0.0300 488/500 [============================>.] - ETA: 4s - loss: 0.3958 - regression_loss: 0.3658 - classification_loss: 0.0300 489/500 [============================>.] - ETA: 3s - loss: 0.3955 - regression_loss: 0.3656 - classification_loss: 0.0300 490/500 [============================>.] - ETA: 3s - loss: 0.3964 - regression_loss: 0.3664 - classification_loss: 0.0300 491/500 [============================>.] - ETA: 3s - loss: 0.3962 - regression_loss: 0.3662 - classification_loss: 0.0300 492/500 [============================>.] - ETA: 2s - loss: 0.3962 - regression_loss: 0.3662 - classification_loss: 0.0300 493/500 [============================>.] - ETA: 2s - loss: 0.3965 - regression_loss: 0.3665 - classification_loss: 0.0300 494/500 [============================>.] - ETA: 2s - loss: 0.3973 - regression_loss: 0.3671 - classification_loss: 0.0302 495/500 [============================>.] - ETA: 1s - loss: 0.3969 - regression_loss: 0.3668 - classification_loss: 0.0301 496/500 [============================>.] - ETA: 1s - loss: 0.3965 - regression_loss: 0.3665 - classification_loss: 0.0301 497/500 [============================>.] - ETA: 1s - loss: 0.3961 - regression_loss: 0.3661 - classification_loss: 0.0300 498/500 [============================>.] - ETA: 0s - loss: 0.3966 - regression_loss: 0.3666 - classification_loss: 0.0301 499/500 [============================>.] 
- ETA: 0s - loss: 0.3964 - regression_loss: 0.3664 - classification_loss: 0.0300
500/500 [==============================] - 172s 343ms/step - loss: 0.3960 - regression_loss: 0.3661 - classification_loss: 0.0300
1172 instances of class plum with average precision: 0.7294
mAP: 0.7294
Epoch 00027: saving model to ./training/snapshots/resnet101_pascal_27.h5
Epoch 28/150
[per-step progress bars for steps 1-333 condensed: running loss moved from 0.2247 at step 1 to roughly 0.406 by step 333 (regression_loss ~0.377, classification_loss ~0.029), with the ETA falling from 2:46 to 57s]
334/500 [===================>..........]
- ETA: 56s - loss: 0.4066 - regression_loss: 0.3774 - classification_loss: 0.0291 335/500 [===================>..........] - ETA: 56s - loss: 0.4068 - regression_loss: 0.3776 - classification_loss: 0.0292 336/500 [===================>..........] - ETA: 56s - loss: 0.4077 - regression_loss: 0.3784 - classification_loss: 0.0293 337/500 [===================>..........] - ETA: 55s - loss: 0.4077 - regression_loss: 0.3785 - classification_loss: 0.0293 338/500 [===================>..........] - ETA: 55s - loss: 0.4075 - regression_loss: 0.3783 - classification_loss: 0.0292 339/500 [===================>..........] - ETA: 55s - loss: 0.4084 - regression_loss: 0.3790 - classification_loss: 0.0294 340/500 [===================>..........] - ETA: 54s - loss: 0.4087 - regression_loss: 0.3793 - classification_loss: 0.0294 341/500 [===================>..........] - ETA: 54s - loss: 0.4091 - regression_loss: 0.3797 - classification_loss: 0.0294 342/500 [===================>..........] - ETA: 53s - loss: 0.4103 - regression_loss: 0.3809 - classification_loss: 0.0294 343/500 [===================>..........] - ETA: 53s - loss: 0.4099 - regression_loss: 0.3805 - classification_loss: 0.0293 344/500 [===================>..........] - ETA: 53s - loss: 0.4102 - regression_loss: 0.3807 - classification_loss: 0.0295 345/500 [===================>..........] - ETA: 52s - loss: 0.4103 - regression_loss: 0.3808 - classification_loss: 0.0295 346/500 [===================>..........] - ETA: 52s - loss: 0.4107 - regression_loss: 0.3811 - classification_loss: 0.0296 347/500 [===================>..........] - ETA: 52s - loss: 0.4109 - regression_loss: 0.3813 - classification_loss: 0.0296 348/500 [===================>..........] - ETA: 51s - loss: 0.4112 - regression_loss: 0.3815 - classification_loss: 0.0297 349/500 [===================>..........] - ETA: 51s - loss: 0.4109 - regression_loss: 0.3811 - classification_loss: 0.0297 350/500 [====================>.........] 
- ETA: 51s - loss: 0.4109 - regression_loss: 0.3812 - classification_loss: 0.0297 351/500 [====================>.........] - ETA: 50s - loss: 0.4106 - regression_loss: 0.3809 - classification_loss: 0.0297 352/500 [====================>.........] - ETA: 50s - loss: 0.4107 - regression_loss: 0.3810 - classification_loss: 0.0297 353/500 [====================>.........] - ETA: 50s - loss: 0.4111 - regression_loss: 0.3814 - classification_loss: 0.0297 354/500 [====================>.........] - ETA: 49s - loss: 0.4110 - regression_loss: 0.3812 - classification_loss: 0.0297 355/500 [====================>.........] - ETA: 49s - loss: 0.4113 - regression_loss: 0.3816 - classification_loss: 0.0297 356/500 [====================>.........] - ETA: 49s - loss: 0.4109 - regression_loss: 0.3812 - classification_loss: 0.0297 357/500 [====================>.........] - ETA: 48s - loss: 0.4103 - regression_loss: 0.3807 - classification_loss: 0.0297 358/500 [====================>.........] - ETA: 48s - loss: 0.4108 - regression_loss: 0.3811 - classification_loss: 0.0297 359/500 [====================>.........] - ETA: 48s - loss: 0.4110 - regression_loss: 0.3813 - classification_loss: 0.0297 360/500 [====================>.........] - ETA: 47s - loss: 0.4115 - regression_loss: 0.3817 - classification_loss: 0.0298 361/500 [====================>.........] - ETA: 47s - loss: 0.4113 - regression_loss: 0.3816 - classification_loss: 0.0297 362/500 [====================>.........] - ETA: 47s - loss: 0.4122 - regression_loss: 0.3823 - classification_loss: 0.0299 363/500 [====================>.........] - ETA: 46s - loss: 0.4127 - regression_loss: 0.3827 - classification_loss: 0.0300 364/500 [====================>.........] - ETA: 46s - loss: 0.4126 - regression_loss: 0.3826 - classification_loss: 0.0300 365/500 [====================>.........] - ETA: 46s - loss: 0.4126 - regression_loss: 0.3826 - classification_loss: 0.0300 366/500 [====================>.........] 
- ETA: 45s - loss: 0.4129 - regression_loss: 0.3829 - classification_loss: 0.0300 367/500 [=====================>........] - ETA: 45s - loss: 0.4129 - regression_loss: 0.3829 - classification_loss: 0.0300 368/500 [=====================>........] - ETA: 45s - loss: 0.4130 - regression_loss: 0.3830 - classification_loss: 0.0299 369/500 [=====================>........] - ETA: 44s - loss: 0.4126 - regression_loss: 0.3827 - classification_loss: 0.0299 370/500 [=====================>........] - ETA: 44s - loss: 0.4118 - regression_loss: 0.3820 - classification_loss: 0.0299 371/500 [=====================>........] - ETA: 44s - loss: 0.4121 - regression_loss: 0.3822 - classification_loss: 0.0298 372/500 [=====================>........] - ETA: 43s - loss: 0.4120 - regression_loss: 0.3821 - classification_loss: 0.0298 373/500 [=====================>........] - ETA: 43s - loss: 0.4113 - regression_loss: 0.3815 - classification_loss: 0.0298 374/500 [=====================>........] - ETA: 43s - loss: 0.4107 - regression_loss: 0.3810 - classification_loss: 0.0298 375/500 [=====================>........] - ETA: 42s - loss: 0.4107 - regression_loss: 0.3809 - classification_loss: 0.0298 376/500 [=====================>........] - ETA: 42s - loss: 0.4111 - regression_loss: 0.3812 - classification_loss: 0.0299 377/500 [=====================>........] - ETA: 42s - loss: 0.4112 - regression_loss: 0.3814 - classification_loss: 0.0298 378/500 [=====================>........] - ETA: 41s - loss: 0.4110 - regression_loss: 0.3812 - classification_loss: 0.0298 379/500 [=====================>........] - ETA: 41s - loss: 0.4109 - regression_loss: 0.3811 - classification_loss: 0.0297 380/500 [=====================>........] - ETA: 41s - loss: 0.4110 - regression_loss: 0.3813 - classification_loss: 0.0297 381/500 [=====================>........] - ETA: 40s - loss: 0.4107 - regression_loss: 0.3810 - classification_loss: 0.0297 382/500 [=====================>........] 
- ETA: 40s - loss: 0.4107 - regression_loss: 0.3810 - classification_loss: 0.0297 383/500 [=====================>........] - ETA: 40s - loss: 0.4109 - regression_loss: 0.3811 - classification_loss: 0.0298 384/500 [======================>.......] - ETA: 39s - loss: 0.4103 - regression_loss: 0.3806 - classification_loss: 0.0297 385/500 [======================>.......] - ETA: 39s - loss: 0.4103 - regression_loss: 0.3805 - classification_loss: 0.0298 386/500 [======================>.......] - ETA: 38s - loss: 0.4107 - regression_loss: 0.3809 - classification_loss: 0.0298 387/500 [======================>.......] - ETA: 38s - loss: 0.4112 - regression_loss: 0.3814 - classification_loss: 0.0298 388/500 [======================>.......] - ETA: 38s - loss: 0.4112 - regression_loss: 0.3813 - classification_loss: 0.0299 389/500 [======================>.......] - ETA: 37s - loss: 0.4105 - regression_loss: 0.3807 - classification_loss: 0.0298 390/500 [======================>.......] - ETA: 37s - loss: 0.4104 - regression_loss: 0.3806 - classification_loss: 0.0298 391/500 [======================>.......] - ETA: 37s - loss: 0.4100 - regression_loss: 0.3802 - classification_loss: 0.0297 392/500 [======================>.......] - ETA: 36s - loss: 0.4094 - regression_loss: 0.3797 - classification_loss: 0.0297 393/500 [======================>.......] - ETA: 36s - loss: 0.4097 - regression_loss: 0.3800 - classification_loss: 0.0297 394/500 [======================>.......] - ETA: 36s - loss: 0.4108 - regression_loss: 0.3810 - classification_loss: 0.0298 395/500 [======================>.......] - ETA: 35s - loss: 0.4108 - regression_loss: 0.3809 - classification_loss: 0.0299 396/500 [======================>.......] - ETA: 35s - loss: 0.4104 - regression_loss: 0.3806 - classification_loss: 0.0298 397/500 [======================>.......] - ETA: 35s - loss: 0.4105 - regression_loss: 0.3807 - classification_loss: 0.0298 398/500 [======================>.......] 
- ETA: 34s - loss: 0.4100 - regression_loss: 0.3803 - classification_loss: 0.0297 399/500 [======================>.......] - ETA: 34s - loss: 0.4094 - regression_loss: 0.3797 - classification_loss: 0.0297 400/500 [=======================>......] - ETA: 34s - loss: 0.4091 - regression_loss: 0.3795 - classification_loss: 0.0296 401/500 [=======================>......] - ETA: 33s - loss: 0.4093 - regression_loss: 0.3797 - classification_loss: 0.0296 402/500 [=======================>......] - ETA: 33s - loss: 0.4097 - regression_loss: 0.3800 - classification_loss: 0.0296 403/500 [=======================>......] - ETA: 33s - loss: 0.4094 - regression_loss: 0.3798 - classification_loss: 0.0296 404/500 [=======================>......] - ETA: 32s - loss: 0.4091 - regression_loss: 0.3795 - classification_loss: 0.0296 405/500 [=======================>......] - ETA: 32s - loss: 0.4087 - regression_loss: 0.3791 - classification_loss: 0.0295 406/500 [=======================>......] - ETA: 32s - loss: 0.4082 - regression_loss: 0.3787 - classification_loss: 0.0295 407/500 [=======================>......] - ETA: 31s - loss: 0.4077 - regression_loss: 0.3783 - classification_loss: 0.0294 408/500 [=======================>......] - ETA: 31s - loss: 0.4070 - regression_loss: 0.3776 - classification_loss: 0.0294 409/500 [=======================>......] - ETA: 31s - loss: 0.4066 - regression_loss: 0.3773 - classification_loss: 0.0293 410/500 [=======================>......] - ETA: 30s - loss: 0.4063 - regression_loss: 0.3770 - classification_loss: 0.0293 411/500 [=======================>......] - ETA: 30s - loss: 0.4060 - regression_loss: 0.3767 - classification_loss: 0.0293 412/500 [=======================>......] - ETA: 30s - loss: 0.4057 - regression_loss: 0.3764 - classification_loss: 0.0293 413/500 [=======================>......] - ETA: 29s - loss: 0.4057 - regression_loss: 0.3764 - classification_loss: 0.0293 414/500 [=======================>......] 
- ETA: 29s - loss: 0.4057 - regression_loss: 0.3764 - classification_loss: 0.0293 415/500 [=======================>......] - ETA: 29s - loss: 0.4057 - regression_loss: 0.3764 - classification_loss: 0.0293 416/500 [=======================>......] - ETA: 28s - loss: 0.4058 - regression_loss: 0.3766 - classification_loss: 0.0293 417/500 [========================>.....] - ETA: 28s - loss: 0.4060 - regression_loss: 0.3768 - classification_loss: 0.0292 418/500 [========================>.....] - ETA: 28s - loss: 0.4058 - regression_loss: 0.3766 - classification_loss: 0.0292 419/500 [========================>.....] - ETA: 27s - loss: 0.4060 - regression_loss: 0.3768 - classification_loss: 0.0292 420/500 [========================>.....] - ETA: 27s - loss: 0.4062 - regression_loss: 0.3769 - classification_loss: 0.0292 421/500 [========================>.....] - ETA: 26s - loss: 0.4057 - regression_loss: 0.3765 - classification_loss: 0.0292 422/500 [========================>.....] - ETA: 26s - loss: 0.4062 - regression_loss: 0.3769 - classification_loss: 0.0293 423/500 [========================>.....] - ETA: 26s - loss: 0.4056 - regression_loss: 0.3763 - classification_loss: 0.0293 424/500 [========================>.....] - ETA: 25s - loss: 0.4053 - regression_loss: 0.3760 - classification_loss: 0.0292 425/500 [========================>.....] - ETA: 25s - loss: 0.4054 - regression_loss: 0.3762 - classification_loss: 0.0292 426/500 [========================>.....] - ETA: 25s - loss: 0.4053 - regression_loss: 0.3761 - classification_loss: 0.0292 427/500 [========================>.....] - ETA: 24s - loss: 0.4054 - regression_loss: 0.3762 - classification_loss: 0.0292 428/500 [========================>.....] - ETA: 24s - loss: 0.4052 - regression_loss: 0.3760 - classification_loss: 0.0292 429/500 [========================>.....] - ETA: 24s - loss: 0.4051 - regression_loss: 0.3760 - classification_loss: 0.0291 430/500 [========================>.....] 
- ETA: 23s - loss: 0.4052 - regression_loss: 0.3761 - classification_loss: 0.0291 431/500 [========================>.....] - ETA: 23s - loss: 0.4054 - regression_loss: 0.3762 - classification_loss: 0.0291 432/500 [========================>.....] - ETA: 23s - loss: 0.4051 - regression_loss: 0.3760 - classification_loss: 0.0291 433/500 [========================>.....] - ETA: 22s - loss: 0.4051 - regression_loss: 0.3760 - classification_loss: 0.0291 434/500 [=========================>....] - ETA: 22s - loss: 0.4048 - regression_loss: 0.3758 - classification_loss: 0.0290 435/500 [=========================>....] - ETA: 22s - loss: 0.4049 - regression_loss: 0.3759 - classification_loss: 0.0290 436/500 [=========================>....] - ETA: 21s - loss: 0.4053 - regression_loss: 0.3763 - classification_loss: 0.0290 437/500 [=========================>....] - ETA: 21s - loss: 0.4051 - regression_loss: 0.3761 - classification_loss: 0.0290 438/500 [=========================>....] - ETA: 21s - loss: 0.4047 - regression_loss: 0.3757 - classification_loss: 0.0290 439/500 [=========================>....] - ETA: 20s - loss: 0.4046 - regression_loss: 0.3757 - classification_loss: 0.0289 440/500 [=========================>....] - ETA: 20s - loss: 0.4044 - regression_loss: 0.3755 - classification_loss: 0.0289 441/500 [=========================>....] - ETA: 20s - loss: 0.4042 - regression_loss: 0.3753 - classification_loss: 0.0289 442/500 [=========================>....] - ETA: 19s - loss: 0.4039 - regression_loss: 0.3750 - classification_loss: 0.0288 443/500 [=========================>....] - ETA: 19s - loss: 0.4037 - regression_loss: 0.3748 - classification_loss: 0.0288 444/500 [=========================>....] - ETA: 19s - loss: 0.4036 - regression_loss: 0.3748 - classification_loss: 0.0288 445/500 [=========================>....] - ETA: 18s - loss: 0.4033 - regression_loss: 0.3745 - classification_loss: 0.0288 446/500 [=========================>....] 
- ETA: 18s - loss: 0.4034 - regression_loss: 0.3746 - classification_loss: 0.0288 447/500 [=========================>....] - ETA: 18s - loss: 0.4031 - regression_loss: 0.3743 - classification_loss: 0.0288 448/500 [=========================>....] - ETA: 17s - loss: 0.4024 - regression_loss: 0.3736 - classification_loss: 0.0287 449/500 [=========================>....] - ETA: 17s - loss: 0.4030 - regression_loss: 0.3743 - classification_loss: 0.0287 450/500 [==========================>...] - ETA: 17s - loss: 0.4039 - regression_loss: 0.3752 - classification_loss: 0.0288 451/500 [==========================>...] - ETA: 16s - loss: 0.4044 - regression_loss: 0.3757 - classification_loss: 0.0288 452/500 [==========================>...] - ETA: 16s - loss: 0.4041 - regression_loss: 0.3753 - classification_loss: 0.0287 453/500 [==========================>...] - ETA: 16s - loss: 0.4042 - regression_loss: 0.3755 - classification_loss: 0.0288 454/500 [==========================>...] - ETA: 15s - loss: 0.4054 - regression_loss: 0.3765 - classification_loss: 0.0289 455/500 [==========================>...] - ETA: 15s - loss: 0.4057 - regression_loss: 0.3768 - classification_loss: 0.0289 456/500 [==========================>...] - ETA: 15s - loss: 0.4067 - regression_loss: 0.3777 - classification_loss: 0.0290 457/500 [==========================>...] - ETA: 14s - loss: 0.4072 - regression_loss: 0.3782 - classification_loss: 0.0290 458/500 [==========================>...] - ETA: 14s - loss: 0.4078 - regression_loss: 0.3787 - classification_loss: 0.0291 459/500 [==========================>...] - ETA: 14s - loss: 0.4083 - regression_loss: 0.3792 - classification_loss: 0.0292 460/500 [==========================>...] - ETA: 13s - loss: 0.4082 - regression_loss: 0.3790 - classification_loss: 0.0291 461/500 [==========================>...] - ETA: 13s - loss: 0.4090 - regression_loss: 0.3798 - classification_loss: 0.0292 462/500 [==========================>...] 
- ETA: 12s - loss: 0.4096 - regression_loss: 0.3804 - classification_loss: 0.0292 463/500 [==========================>...] - ETA: 12s - loss: 0.4098 - regression_loss: 0.3806 - classification_loss: 0.0292 464/500 [==========================>...] - ETA: 12s - loss: 0.4100 - regression_loss: 0.3808 - classification_loss: 0.0292 465/500 [==========================>...] - ETA: 11s - loss: 0.4094 - regression_loss: 0.3803 - classification_loss: 0.0292 466/500 [==========================>...] - ETA: 11s - loss: 0.4093 - regression_loss: 0.3801 - classification_loss: 0.0292 467/500 [===========================>..] - ETA: 11s - loss: 0.4089 - regression_loss: 0.3798 - classification_loss: 0.0291 468/500 [===========================>..] - ETA: 10s - loss: 0.4096 - regression_loss: 0.3804 - classification_loss: 0.0291 469/500 [===========================>..] - ETA: 10s - loss: 0.4113 - regression_loss: 0.3818 - classification_loss: 0.0294 470/500 [===========================>..] - ETA: 10s - loss: 0.4119 - regression_loss: 0.3824 - classification_loss: 0.0295 471/500 [===========================>..] - ETA: 9s - loss: 0.4126 - regression_loss: 0.3830 - classification_loss: 0.0296  472/500 [===========================>..] - ETA: 9s - loss: 0.4128 - regression_loss: 0.3831 - classification_loss: 0.0297 473/500 [===========================>..] - ETA: 9s - loss: 0.4131 - regression_loss: 0.3833 - classification_loss: 0.0297 474/500 [===========================>..] - ETA: 8s - loss: 0.4131 - regression_loss: 0.3833 - classification_loss: 0.0298 475/500 [===========================>..] - ETA: 8s - loss: 0.4134 - regression_loss: 0.3836 - classification_loss: 0.0298 476/500 [===========================>..] - ETA: 8s - loss: 0.4135 - regression_loss: 0.3838 - classification_loss: 0.0298 477/500 [===========================>..] - ETA: 7s - loss: 0.4139 - regression_loss: 0.3841 - classification_loss: 0.0298 478/500 [===========================>..] 
- ETA: 7s - loss: 0.4136 - regression_loss: 0.3838 - classification_loss: 0.0298 479/500 [===========================>..] - ETA: 7s - loss: 0.4142 - regression_loss: 0.3842 - classification_loss: 0.0300 480/500 [===========================>..] - ETA: 6s - loss: 0.4145 - regression_loss: 0.3844 - classification_loss: 0.0300 481/500 [===========================>..] - ETA: 6s - loss: 0.4145 - regression_loss: 0.3844 - classification_loss: 0.0301 482/500 [===========================>..] - ETA: 6s - loss: 0.4142 - regression_loss: 0.3841 - classification_loss: 0.0301 483/500 [===========================>..] - ETA: 5s - loss: 0.4142 - regression_loss: 0.3841 - classification_loss: 0.0301 484/500 [============================>.] - ETA: 5s - loss: 0.4146 - regression_loss: 0.3845 - classification_loss: 0.0301 485/500 [============================>.] - ETA: 5s - loss: 0.4148 - regression_loss: 0.3847 - classification_loss: 0.0301 486/500 [============================>.] - ETA: 4s - loss: 0.4146 - regression_loss: 0.3845 - classification_loss: 0.0301 487/500 [============================>.] - ETA: 4s - loss: 0.4147 - regression_loss: 0.3846 - classification_loss: 0.0301 488/500 [============================>.] - ETA: 4s - loss: 0.4147 - regression_loss: 0.3846 - classification_loss: 0.0301 489/500 [============================>.] - ETA: 3s - loss: 0.4149 - regression_loss: 0.3847 - classification_loss: 0.0302 490/500 [============================>.] - ETA: 3s - loss: 0.4144 - regression_loss: 0.3843 - classification_loss: 0.0301 491/500 [============================>.] - ETA: 3s - loss: 0.4149 - regression_loss: 0.3846 - classification_loss: 0.0303 492/500 [============================>.] - ETA: 2s - loss: 0.4150 - regression_loss: 0.3848 - classification_loss: 0.0303 493/500 [============================>.] - ETA: 2s - loss: 0.4157 - regression_loss: 0.3853 - classification_loss: 0.0304 494/500 [============================>.] 
- ETA: 2s - loss: 0.4157 - regression_loss: 0.3853 - classification_loss: 0.0304 495/500 [============================>.] - ETA: 1s - loss: 0.4153 - regression_loss: 0.3850 - classification_loss: 0.0303 496/500 [============================>.] - ETA: 1s - loss: 0.4153 - regression_loss: 0.3849 - classification_loss: 0.0304 497/500 [============================>.] - ETA: 1s - loss: 0.4156 - regression_loss: 0.3852 - classification_loss: 0.0304 498/500 [============================>.] - ETA: 0s - loss: 0.4155 - regression_loss: 0.3851 - classification_loss: 0.0304 499/500 [============================>.] - ETA: 0s - loss: 0.4150 - regression_loss: 0.3846 - classification_loss: 0.0304 500/500 [==============================] - 171s 342ms/step - loss: 0.4150 - regression_loss: 0.3846 - classification_loss: 0.0304 1172 instances of class plum with average precision: 0.7247 mAP: 0.7247 Epoch 00028: saving model to ./training/snapshots/resnet101_pascal_28.h5 Epoch 29/150 1/500 [..............................] - ETA: 2:49 - loss: 0.4428 - regression_loss: 0.3970 - classification_loss: 0.0458 2/500 [..............................] - ETA: 2:55 - loss: 0.7513 - regression_loss: 0.6739 - classification_loss: 0.0774 3/500 [..............................] - ETA: 2:55 - loss: 0.5769 - regression_loss: 0.5223 - classification_loss: 0.0546 4/500 [..............................] - ETA: 2:55 - loss: 0.5459 - regression_loss: 0.4971 - classification_loss: 0.0488 5/500 [..............................] - ETA: 2:54 - loss: 0.5144 - regression_loss: 0.4691 - classification_loss: 0.0453 6/500 [..............................] - ETA: 2:52 - loss: 0.4956 - regression_loss: 0.4514 - classification_loss: 0.0442 7/500 [..............................] - ETA: 2:52 - loss: 0.5146 - regression_loss: 0.4718 - classification_loss: 0.0428 8/500 [..............................] - ETA: 2:52 - loss: 0.4853 - regression_loss: 0.4460 - classification_loss: 0.0393 9/500 [..............................] 
- ETA: 2:51 - loss: 0.4832 - regression_loss: 0.4454 - classification_loss: 0.0379 10/500 [..............................] - ETA: 2:50 - loss: 0.4870 - regression_loss: 0.4510 - classification_loss: 0.0360 11/500 [..............................] - ETA: 2:50 - loss: 0.4860 - regression_loss: 0.4509 - classification_loss: 0.0351 12/500 [..............................] - ETA: 2:49 - loss: 0.4864 - regression_loss: 0.4482 - classification_loss: 0.0381 13/500 [..............................] - ETA: 2:49 - loss: 0.4975 - regression_loss: 0.4570 - classification_loss: 0.0405 14/500 [..............................] - ETA: 2:49 - loss: 0.4760 - regression_loss: 0.4372 - classification_loss: 0.0388 15/500 [..............................] - ETA: 2:48 - loss: 0.4598 - regression_loss: 0.4225 - classification_loss: 0.0372 16/500 [..............................] - ETA: 2:48 - loss: 0.4475 - regression_loss: 0.4122 - classification_loss: 0.0353 17/500 [>.............................] - ETA: 2:46 - loss: 0.4528 - regression_loss: 0.4178 - classification_loss: 0.0349 18/500 [>.............................] - ETA: 2:46 - loss: 0.4435 - regression_loss: 0.4095 - classification_loss: 0.0341 19/500 [>.............................] - ETA: 2:46 - loss: 0.4456 - regression_loss: 0.4118 - classification_loss: 0.0338 20/500 [>.............................] - ETA: 2:46 - loss: 0.4379 - regression_loss: 0.4051 - classification_loss: 0.0328 21/500 [>.............................] - ETA: 2:46 - loss: 0.4318 - regression_loss: 0.4000 - classification_loss: 0.0318 22/500 [>.............................] - ETA: 2:46 - loss: 0.4314 - regression_loss: 0.3994 - classification_loss: 0.0320 23/500 [>.............................] - ETA: 2:45 - loss: 0.4363 - regression_loss: 0.4033 - classification_loss: 0.0331 24/500 [>.............................] - ETA: 2:45 - loss: 0.4417 - regression_loss: 0.4081 - classification_loss: 0.0336 25/500 [>.............................] 
- ETA: 2:45 - loss: 0.4410 - regression_loss: 0.4084 - classification_loss: 0.0326 26/500 [>.............................] - ETA: 2:44 - loss: 0.4327 - regression_loss: 0.4005 - classification_loss: 0.0322 27/500 [>.............................] - ETA: 2:43 - loss: 0.4287 - regression_loss: 0.3963 - classification_loss: 0.0323 28/500 [>.............................] - ETA: 2:43 - loss: 0.4247 - regression_loss: 0.3929 - classification_loss: 0.0318 29/500 [>.............................] - ETA: 2:42 - loss: 0.4253 - regression_loss: 0.3928 - classification_loss: 0.0325 30/500 [>.............................] - ETA: 2:42 - loss: 0.4323 - regression_loss: 0.3999 - classification_loss: 0.0324 31/500 [>.............................] - ETA: 2:41 - loss: 0.4295 - regression_loss: 0.3979 - classification_loss: 0.0316 32/500 [>.............................] - ETA: 2:41 - loss: 0.4227 - regression_loss: 0.3918 - classification_loss: 0.0309 33/500 [>.............................] - ETA: 2:40 - loss: 0.4157 - regression_loss: 0.3855 - classification_loss: 0.0301 34/500 [=>............................] - ETA: 2:40 - loss: 0.4092 - regression_loss: 0.3796 - classification_loss: 0.0296 35/500 [=>............................] - ETA: 2:40 - loss: 0.4016 - regression_loss: 0.3726 - classification_loss: 0.0290 36/500 [=>............................] - ETA: 2:39 - loss: 0.4026 - regression_loss: 0.3735 - classification_loss: 0.0291 37/500 [=>............................] - ETA: 2:39 - loss: 0.4021 - regression_loss: 0.3730 - classification_loss: 0.0291 38/500 [=>............................] - ETA: 2:39 - loss: 0.4014 - regression_loss: 0.3724 - classification_loss: 0.0290 39/500 [=>............................] - ETA: 2:38 - loss: 0.4058 - regression_loss: 0.3766 - classification_loss: 0.0292 40/500 [=>............................] - ETA: 2:38 - loss: 0.4058 - regression_loss: 0.3770 - classification_loss: 0.0289 41/500 [=>............................] 
 42/500 [=>............................] - ETA: 2:37 - loss: 0.4095 - regression_loss: 0.3810 - classification_loss: 0.0284
 50/500 [==>...........................] - ETA: 2:34 - loss: 0.4031 - regression_loss: 0.3755 - classification_loss: 0.0277
100/500 [=====>........................] - ETA: 2:16 - loss: 0.3748 - regression_loss: 0.3490 - classification_loss: 0.0258
150/500 [========>.....................] - ETA: 2:00 - loss: 0.3785 - regression_loss: 0.3527 - classification_loss: 0.0257
200/500 [===========>..................] - ETA: 1:43 - loss: 0.3946 - regression_loss: 0.3670 - classification_loss: 0.0275
250/500 [==============>...............] - ETA: 1:26 - loss: 0.3900 - regression_loss: 0.3631 - classification_loss: 0.0269
300/500 [=================>............] - ETA: 1:08 - loss: 0.3882 - regression_loss: 0.3612 - classification_loss: 0.0270
350/500 [====================>.........] - ETA: 51s - loss: 0.3909 - regression_loss: 0.3634 - classification_loss: 0.0276
376/500 [=====================>........] - ETA: 42s - loss: 0.3923 - regression_loss: 0.3647 - classification_loss: 0.0276
[per-batch progress updates between the milestones above elided; running loss held between ~0.37 and ~0.41 over steps 42-376]
- ETA: 42s - loss: 0.3918 - regression_loss: 0.3643 - classification_loss: 0.0276 378/500 [=====================>........] - ETA: 41s - loss: 0.3913 - regression_loss: 0.3638 - classification_loss: 0.0275 379/500 [=====================>........] - ETA: 41s - loss: 0.3908 - regression_loss: 0.3633 - classification_loss: 0.0275 380/500 [=====================>........] - ETA: 41s - loss: 0.3906 - regression_loss: 0.3631 - classification_loss: 0.0274 381/500 [=====================>........] - ETA: 40s - loss: 0.3902 - regression_loss: 0.3628 - classification_loss: 0.0274 382/500 [=====================>........] - ETA: 40s - loss: 0.3900 - regression_loss: 0.3626 - classification_loss: 0.0274 383/500 [=====================>........] - ETA: 40s - loss: 0.3899 - regression_loss: 0.3625 - classification_loss: 0.0274 384/500 [======================>.......] - ETA: 39s - loss: 0.3896 - regression_loss: 0.3622 - classification_loss: 0.0273 385/500 [======================>.......] - ETA: 39s - loss: 0.3891 - regression_loss: 0.3617 - classification_loss: 0.0273 386/500 [======================>.......] - ETA: 39s - loss: 0.3890 - regression_loss: 0.3616 - classification_loss: 0.0273 387/500 [======================>.......] - ETA: 38s - loss: 0.3888 - regression_loss: 0.3615 - classification_loss: 0.0273 388/500 [======================>.......] - ETA: 38s - loss: 0.3887 - regression_loss: 0.3614 - classification_loss: 0.0273 389/500 [======================>.......] - ETA: 38s - loss: 0.3891 - regression_loss: 0.3617 - classification_loss: 0.0274 390/500 [======================>.......] - ETA: 37s - loss: 0.3887 - regression_loss: 0.3613 - classification_loss: 0.0274 391/500 [======================>.......] - ETA: 37s - loss: 0.3885 - regression_loss: 0.3611 - classification_loss: 0.0273 392/500 [======================>.......] - ETA: 37s - loss: 0.3881 - regression_loss: 0.3608 - classification_loss: 0.0273 393/500 [======================>.......] 
- ETA: 36s - loss: 0.3881 - regression_loss: 0.3608 - classification_loss: 0.0273 394/500 [======================>.......] - ETA: 36s - loss: 0.3882 - regression_loss: 0.3610 - classification_loss: 0.0273 395/500 [======================>.......] - ETA: 36s - loss: 0.3877 - regression_loss: 0.3604 - classification_loss: 0.0273 396/500 [======================>.......] - ETA: 35s - loss: 0.3880 - regression_loss: 0.3607 - classification_loss: 0.0273 397/500 [======================>.......] - ETA: 35s - loss: 0.3883 - regression_loss: 0.3609 - classification_loss: 0.0273 398/500 [======================>.......] - ETA: 34s - loss: 0.3886 - regression_loss: 0.3612 - classification_loss: 0.0274 399/500 [======================>.......] - ETA: 34s - loss: 0.3891 - regression_loss: 0.3617 - classification_loss: 0.0274 400/500 [=======================>......] - ETA: 34s - loss: 0.3893 - regression_loss: 0.3619 - classification_loss: 0.0274 401/500 [=======================>......] - ETA: 33s - loss: 0.3897 - regression_loss: 0.3623 - classification_loss: 0.0274 402/500 [=======================>......] - ETA: 33s - loss: 0.3895 - regression_loss: 0.3620 - classification_loss: 0.0274 403/500 [=======================>......] - ETA: 33s - loss: 0.3894 - regression_loss: 0.3619 - classification_loss: 0.0274 404/500 [=======================>......] - ETA: 32s - loss: 0.3889 - regression_loss: 0.3614 - classification_loss: 0.0274 405/500 [=======================>......] - ETA: 32s - loss: 0.3893 - regression_loss: 0.3618 - classification_loss: 0.0275 406/500 [=======================>......] - ETA: 32s - loss: 0.3895 - regression_loss: 0.3621 - classification_loss: 0.0275 407/500 [=======================>......] - ETA: 31s - loss: 0.3903 - regression_loss: 0.3627 - classification_loss: 0.0276 408/500 [=======================>......] - ETA: 31s - loss: 0.3899 - regression_loss: 0.3623 - classification_loss: 0.0276 409/500 [=======================>......] 
- ETA: 31s - loss: 0.3901 - regression_loss: 0.3624 - classification_loss: 0.0276 410/500 [=======================>......] - ETA: 30s - loss: 0.3900 - regression_loss: 0.3624 - classification_loss: 0.0276 411/500 [=======================>......] - ETA: 30s - loss: 0.3901 - regression_loss: 0.3625 - classification_loss: 0.0276 412/500 [=======================>......] - ETA: 30s - loss: 0.3897 - regression_loss: 0.3621 - classification_loss: 0.0276 413/500 [=======================>......] - ETA: 29s - loss: 0.3898 - regression_loss: 0.3623 - classification_loss: 0.0276 414/500 [=======================>......] - ETA: 29s - loss: 0.3899 - regression_loss: 0.3623 - classification_loss: 0.0276 415/500 [=======================>......] - ETA: 29s - loss: 0.3897 - regression_loss: 0.3622 - classification_loss: 0.0275 416/500 [=======================>......] - ETA: 28s - loss: 0.3895 - regression_loss: 0.3619 - classification_loss: 0.0276 417/500 [========================>.....] - ETA: 28s - loss: 0.3897 - regression_loss: 0.3621 - classification_loss: 0.0276 418/500 [========================>.....] - ETA: 28s - loss: 0.3898 - regression_loss: 0.3622 - classification_loss: 0.0276 419/500 [========================>.....] - ETA: 27s - loss: 0.3901 - regression_loss: 0.3625 - classification_loss: 0.0276 420/500 [========================>.....] - ETA: 27s - loss: 0.3900 - regression_loss: 0.3624 - classification_loss: 0.0276 421/500 [========================>.....] - ETA: 27s - loss: 0.3896 - regression_loss: 0.3621 - classification_loss: 0.0276 422/500 [========================>.....] - ETA: 26s - loss: 0.3900 - regression_loss: 0.3624 - classification_loss: 0.0276 423/500 [========================>.....] - ETA: 26s - loss: 0.3896 - regression_loss: 0.3621 - classification_loss: 0.0276 424/500 [========================>.....] - ETA: 26s - loss: 0.3901 - regression_loss: 0.3625 - classification_loss: 0.0275 425/500 [========================>.....] 
- ETA: 25s - loss: 0.3904 - regression_loss: 0.3628 - classification_loss: 0.0275 426/500 [========================>.....] - ETA: 25s - loss: 0.3904 - regression_loss: 0.3628 - classification_loss: 0.0275 427/500 [========================>.....] - ETA: 25s - loss: 0.3904 - regression_loss: 0.3629 - classification_loss: 0.0275 428/500 [========================>.....] - ETA: 24s - loss: 0.3902 - regression_loss: 0.3627 - classification_loss: 0.0275 429/500 [========================>.....] - ETA: 24s - loss: 0.3900 - regression_loss: 0.3626 - classification_loss: 0.0275 430/500 [========================>.....] - ETA: 24s - loss: 0.3897 - regression_loss: 0.3622 - classification_loss: 0.0274 431/500 [========================>.....] - ETA: 23s - loss: 0.3892 - regression_loss: 0.3618 - classification_loss: 0.0274 432/500 [========================>.....] - ETA: 23s - loss: 0.3888 - regression_loss: 0.3614 - classification_loss: 0.0274 433/500 [========================>.....] - ETA: 23s - loss: 0.3888 - regression_loss: 0.3615 - classification_loss: 0.0274 434/500 [=========================>....] - ETA: 22s - loss: 0.3890 - regression_loss: 0.3616 - classification_loss: 0.0274 435/500 [=========================>....] - ETA: 22s - loss: 0.3888 - regression_loss: 0.3614 - classification_loss: 0.0274 436/500 [=========================>....] - ETA: 21s - loss: 0.3887 - regression_loss: 0.3614 - classification_loss: 0.0274 437/500 [=========================>....] - ETA: 21s - loss: 0.3888 - regression_loss: 0.3614 - classification_loss: 0.0274 438/500 [=========================>....] - ETA: 21s - loss: 0.3884 - regression_loss: 0.3610 - classification_loss: 0.0273 439/500 [=========================>....] - ETA: 20s - loss: 0.3879 - regression_loss: 0.3606 - classification_loss: 0.0273 440/500 [=========================>....] - ETA: 20s - loss: 0.3877 - regression_loss: 0.3604 - classification_loss: 0.0273 441/500 [=========================>....] 
- ETA: 20s - loss: 0.3876 - regression_loss: 0.3603 - classification_loss: 0.0273 442/500 [=========================>....] - ETA: 19s - loss: 0.3877 - regression_loss: 0.3604 - classification_loss: 0.0273 443/500 [=========================>....] - ETA: 19s - loss: 0.3872 - regression_loss: 0.3599 - classification_loss: 0.0272 444/500 [=========================>....] - ETA: 19s - loss: 0.3872 - regression_loss: 0.3600 - classification_loss: 0.0272 445/500 [=========================>....] - ETA: 18s - loss: 0.3873 - regression_loss: 0.3600 - classification_loss: 0.0273 446/500 [=========================>....] - ETA: 18s - loss: 0.3868 - regression_loss: 0.3595 - classification_loss: 0.0272 447/500 [=========================>....] - ETA: 18s - loss: 0.3875 - regression_loss: 0.3601 - classification_loss: 0.0274 448/500 [=========================>....] - ETA: 17s - loss: 0.3877 - regression_loss: 0.3603 - classification_loss: 0.0275 449/500 [=========================>....] - ETA: 17s - loss: 0.3873 - regression_loss: 0.3599 - classification_loss: 0.0274 450/500 [==========================>...] - ETA: 17s - loss: 0.3870 - regression_loss: 0.3596 - classification_loss: 0.0274 451/500 [==========================>...] - ETA: 16s - loss: 0.3870 - regression_loss: 0.3596 - classification_loss: 0.0274 452/500 [==========================>...] - ETA: 16s - loss: 0.3872 - regression_loss: 0.3598 - classification_loss: 0.0274 453/500 [==========================>...] - ETA: 16s - loss: 0.3872 - regression_loss: 0.3598 - classification_loss: 0.0274 454/500 [==========================>...] - ETA: 15s - loss: 0.3873 - regression_loss: 0.3599 - classification_loss: 0.0274 455/500 [==========================>...] - ETA: 15s - loss: 0.3868 - regression_loss: 0.3595 - classification_loss: 0.0274 456/500 [==========================>...] - ETA: 15s - loss: 0.3867 - regression_loss: 0.3593 - classification_loss: 0.0274 457/500 [==========================>...] 
- ETA: 14s - loss: 0.3871 - regression_loss: 0.3597 - classification_loss: 0.0274 458/500 [==========================>...] - ETA: 14s - loss: 0.3873 - regression_loss: 0.3599 - classification_loss: 0.0274 459/500 [==========================>...] - ETA: 14s - loss: 0.3871 - regression_loss: 0.3597 - classification_loss: 0.0274 460/500 [==========================>...] - ETA: 13s - loss: 0.3868 - regression_loss: 0.3595 - classification_loss: 0.0273 461/500 [==========================>...] - ETA: 13s - loss: 0.3864 - regression_loss: 0.3591 - classification_loss: 0.0273 462/500 [==========================>...] - ETA: 13s - loss: 0.3865 - regression_loss: 0.3592 - classification_loss: 0.0273 463/500 [==========================>...] - ETA: 12s - loss: 0.3862 - regression_loss: 0.3589 - classification_loss: 0.0273 464/500 [==========================>...] - ETA: 12s - loss: 0.3861 - regression_loss: 0.3588 - classification_loss: 0.0273 465/500 [==========================>...] - ETA: 12s - loss: 0.3857 - regression_loss: 0.3585 - classification_loss: 0.0272 466/500 [==========================>...] - ETA: 11s - loss: 0.3860 - regression_loss: 0.3588 - classification_loss: 0.0273 467/500 [===========================>..] - ETA: 11s - loss: 0.3858 - regression_loss: 0.3586 - classification_loss: 0.0272 468/500 [===========================>..] - ETA: 10s - loss: 0.3862 - regression_loss: 0.3588 - classification_loss: 0.0274 469/500 [===========================>..] - ETA: 10s - loss: 0.3863 - regression_loss: 0.3589 - classification_loss: 0.0273 470/500 [===========================>..] - ETA: 10s - loss: 0.3860 - regression_loss: 0.3586 - classification_loss: 0.0273 471/500 [===========================>..] - ETA: 9s - loss: 0.3861 - regression_loss: 0.3587 - classification_loss: 0.0273  472/500 [===========================>..] - ETA: 9s - loss: 0.3859 - regression_loss: 0.3586 - classification_loss: 0.0273 473/500 [===========================>..] 
- ETA: 9s - loss: 0.3855 - regression_loss: 0.3582 - classification_loss: 0.0273 474/500 [===========================>..] - ETA: 8s - loss: 0.3856 - regression_loss: 0.3583 - classification_loss: 0.0273 475/500 [===========================>..] - ETA: 8s - loss: 0.3859 - regression_loss: 0.3586 - classification_loss: 0.0273 476/500 [===========================>..] - ETA: 8s - loss: 0.3869 - regression_loss: 0.3596 - classification_loss: 0.0274 477/500 [===========================>..] - ETA: 7s - loss: 0.3869 - regression_loss: 0.3596 - classification_loss: 0.0274 478/500 [===========================>..] - ETA: 7s - loss: 0.3863 - regression_loss: 0.3590 - classification_loss: 0.0273 479/500 [===========================>..] - ETA: 7s - loss: 0.3862 - regression_loss: 0.3589 - classification_loss: 0.0273 480/500 [===========================>..] - ETA: 6s - loss: 0.3866 - regression_loss: 0.3592 - classification_loss: 0.0273 481/500 [===========================>..] - ETA: 6s - loss: 0.3865 - regression_loss: 0.3592 - classification_loss: 0.0273 482/500 [===========================>..] - ETA: 6s - loss: 0.3863 - regression_loss: 0.3590 - classification_loss: 0.0274 483/500 [===========================>..] - ETA: 5s - loss: 0.3867 - regression_loss: 0.3594 - classification_loss: 0.0274 484/500 [============================>.] - ETA: 5s - loss: 0.3871 - regression_loss: 0.3597 - classification_loss: 0.0274 485/500 [============================>.] - ETA: 5s - loss: 0.3874 - regression_loss: 0.3600 - classification_loss: 0.0274 486/500 [============================>.] - ETA: 4s - loss: 0.3875 - regression_loss: 0.3601 - classification_loss: 0.0274 487/500 [============================>.] - ETA: 4s - loss: 0.3874 - regression_loss: 0.3599 - classification_loss: 0.0274 488/500 [============================>.] - ETA: 4s - loss: 0.3873 - regression_loss: 0.3599 - classification_loss: 0.0274 489/500 [============================>.] 
- ETA: 3s - loss: 0.3870 - regression_loss: 0.3596 - classification_loss: 0.0273 490/500 [============================>.] - ETA: 3s - loss: 0.3871 - regression_loss: 0.3598 - classification_loss: 0.0274 491/500 [============================>.] - ETA: 3s - loss: 0.3874 - regression_loss: 0.3600 - classification_loss: 0.0274 492/500 [============================>.] - ETA: 2s - loss: 0.3872 - regression_loss: 0.3598 - classification_loss: 0.0274 493/500 [============================>.] - ETA: 2s - loss: 0.3877 - regression_loss: 0.3602 - classification_loss: 0.0275 494/500 [============================>.] - ETA: 2s - loss: 0.3880 - regression_loss: 0.3605 - classification_loss: 0.0275 495/500 [============================>.] - ETA: 1s - loss: 0.3876 - regression_loss: 0.3601 - classification_loss: 0.0275 496/500 [============================>.] - ETA: 1s - loss: 0.3874 - regression_loss: 0.3599 - classification_loss: 0.0275 497/500 [============================>.] - ETA: 1s - loss: 0.3874 - regression_loss: 0.3600 - classification_loss: 0.0275 498/500 [============================>.] - ETA: 0s - loss: 0.3876 - regression_loss: 0.3601 - classification_loss: 0.0275 499/500 [============================>.] - ETA: 0s - loss: 0.3877 - regression_loss: 0.3602 - classification_loss: 0.0275 500/500 [==============================] - 172s 344ms/step - loss: 0.3877 - regression_loss: 0.3602 - classification_loss: 0.0275
1172 instances of class plum with average precision: 0.7358
mAP: 0.7358
Epoch 00029: saving model to ./training/snapshots/resnet101_pascal_29.h5
Epoch 30/150
1/500 [..............................] - ETA: 2:40 - loss: 0.5006 - regression_loss: 0.4589 - classification_loss: 0.0416 2/500 [..............................] - ETA: 2:41 - loss: 0.4502 - regression_loss: 0.4137 - classification_loss: 0.0366 3/500 [..............................] - ETA: 2:44 - loss: 0.4700 - regression_loss: 0.4348 - classification_loss: 0.0352 4/500 [..............................] 
- ETA: 2:47 - loss: 0.3924 - regression_loss: 0.3634 - classification_loss: 0.0289 5/500 [..............................] - ETA: 2:49 - loss: 0.3822 - regression_loss: 0.3547 - classification_loss: 0.0275 6/500 [..............................] - ETA: 2:48 - loss: 0.3410 - regression_loss: 0.3172 - classification_loss: 0.0237 7/500 [..............................] - ETA: 2:48 - loss: 0.3338 - regression_loss: 0.3105 - classification_loss: 0.0234 8/500 [..............................] - ETA: 2:48 - loss: 0.3580 - regression_loss: 0.3334 - classification_loss: 0.0246 9/500 [..............................] - ETA: 2:47 - loss: 0.3715 - regression_loss: 0.3460 - classification_loss: 0.0255 10/500 [..............................] - ETA: 2:46 - loss: 0.3657 - regression_loss: 0.3414 - classification_loss: 0.0243 11/500 [..............................] - ETA: 2:46 - loss: 0.3625 - regression_loss: 0.3390 - classification_loss: 0.0235 12/500 [..............................] - ETA: 2:45 - loss: 0.3585 - regression_loss: 0.3361 - classification_loss: 0.0223 13/500 [..............................] - ETA: 2:44 - loss: 0.3507 - regression_loss: 0.3286 - classification_loss: 0.0220 14/500 [..............................] - ETA: 2:44 - loss: 0.3410 - regression_loss: 0.3193 - classification_loss: 0.0216 15/500 [..............................] - ETA: 2:44 - loss: 0.3478 - regression_loss: 0.3255 - classification_loss: 0.0223 16/500 [..............................] - ETA: 2:43 - loss: 0.3478 - regression_loss: 0.3254 - classification_loss: 0.0224 17/500 [>.............................] - ETA: 2:43 - loss: 0.3542 - regression_loss: 0.3314 - classification_loss: 0.0227 18/500 [>.............................] - ETA: 2:42 - loss: 0.3565 - regression_loss: 0.3319 - classification_loss: 0.0247 19/500 [>.............................] - ETA: 2:42 - loss: 0.3638 - regression_loss: 0.3374 - classification_loss: 0.0263 20/500 [>.............................] 
- ETA: 2:42 - loss: 0.3638 - regression_loss: 0.3379 - classification_loss: 0.0259 21/500 [>.............................] - ETA: 2:42 - loss: 0.3668 - regression_loss: 0.3408 - classification_loss: 0.0260 22/500 [>.............................] - ETA: 2:42 - loss: 0.3719 - regression_loss: 0.3447 - classification_loss: 0.0271 23/500 [>.............................] - ETA: 2:41 - loss: 0.3677 - regression_loss: 0.3410 - classification_loss: 0.0267 24/500 [>.............................] - ETA: 2:41 - loss: 0.3597 - regression_loss: 0.3336 - classification_loss: 0.0261 25/500 [>.............................] - ETA: 2:41 - loss: 0.3508 - regression_loss: 0.3255 - classification_loss: 0.0253 26/500 [>.............................] - ETA: 2:41 - loss: 0.3456 - regression_loss: 0.3209 - classification_loss: 0.0247 27/500 [>.............................] - ETA: 2:41 - loss: 0.3482 - regression_loss: 0.3234 - classification_loss: 0.0248 28/500 [>.............................] - ETA: 2:40 - loss: 0.3500 - regression_loss: 0.3253 - classification_loss: 0.0247 29/500 [>.............................] - ETA: 2:40 - loss: 0.3452 - regression_loss: 0.3208 - classification_loss: 0.0245 30/500 [>.............................] - ETA: 2:40 - loss: 0.3449 - regression_loss: 0.3200 - classification_loss: 0.0249 31/500 [>.............................] - ETA: 2:40 - loss: 0.3431 - regression_loss: 0.3181 - classification_loss: 0.0250 32/500 [>.............................] - ETA: 2:39 - loss: 0.3407 - regression_loss: 0.3160 - classification_loss: 0.0247 33/500 [>.............................] - ETA: 2:39 - loss: 0.3362 - regression_loss: 0.3121 - classification_loss: 0.0241 34/500 [=>............................] - ETA: 2:39 - loss: 0.3379 - regression_loss: 0.3137 - classification_loss: 0.0242 35/500 [=>............................] - ETA: 2:38 - loss: 0.3390 - regression_loss: 0.3150 - classification_loss: 0.0240 36/500 [=>............................] 
- ETA: 2:38 - loss: 0.3539 - regression_loss: 0.3276 - classification_loss: 0.0263 37/500 [=>............................] - ETA: 2:38 - loss: 0.3573 - regression_loss: 0.3309 - classification_loss: 0.0264 38/500 [=>............................] - ETA: 2:37 - loss: 0.3511 - regression_loss: 0.3251 - classification_loss: 0.0259 39/500 [=>............................] - ETA: 2:37 - loss: 0.3471 - regression_loss: 0.3216 - classification_loss: 0.0254 40/500 [=>............................] - ETA: 2:37 - loss: 0.3491 - regression_loss: 0.3233 - classification_loss: 0.0258 41/500 [=>............................] - ETA: 2:37 - loss: 0.3476 - regression_loss: 0.3220 - classification_loss: 0.0256 42/500 [=>............................] - ETA: 2:36 - loss: 0.3516 - regression_loss: 0.3263 - classification_loss: 0.0253 43/500 [=>............................] - ETA: 2:36 - loss: 0.3510 - regression_loss: 0.3258 - classification_loss: 0.0253 44/500 [=>............................] - ETA: 2:36 - loss: 0.3476 - regression_loss: 0.3226 - classification_loss: 0.0249 45/500 [=>............................] - ETA: 2:36 - loss: 0.3472 - regression_loss: 0.3225 - classification_loss: 0.0247 46/500 [=>............................] - ETA: 2:35 - loss: 0.3469 - regression_loss: 0.3221 - classification_loss: 0.0248 47/500 [=>............................] - ETA: 2:35 - loss: 0.3444 - regression_loss: 0.3198 - classification_loss: 0.0246 48/500 [=>............................] - ETA: 2:34 - loss: 0.3428 - regression_loss: 0.3185 - classification_loss: 0.0243 49/500 [=>............................] - ETA: 2:34 - loss: 0.3449 - regression_loss: 0.3201 - classification_loss: 0.0248 50/500 [==>...........................] - ETA: 2:34 - loss: 0.3452 - regression_loss: 0.3205 - classification_loss: 0.0247 51/500 [==>...........................] - ETA: 2:33 - loss: 0.3439 - regression_loss: 0.3195 - classification_loss: 0.0245 52/500 [==>...........................] 
- ETA: 2:33 - loss: 0.3456 - regression_loss: 0.3212 - classification_loss: 0.0244 53/500 [==>...........................] - ETA: 2:33 - loss: 0.3437 - regression_loss: 0.3196 - classification_loss: 0.0242 54/500 [==>...........................] - ETA: 2:32 - loss: 0.3430 - regression_loss: 0.3189 - classification_loss: 0.0241 55/500 [==>...........................] - ETA: 2:32 - loss: 0.3473 - regression_loss: 0.3233 - classification_loss: 0.0240 56/500 [==>...........................] - ETA: 2:31 - loss: 0.3469 - regression_loss: 0.3228 - classification_loss: 0.0241 57/500 [==>...........................] - ETA: 2:31 - loss: 0.3484 - regression_loss: 0.3242 - classification_loss: 0.0241 58/500 [==>...........................] - ETA: 2:31 - loss: 0.3454 - regression_loss: 0.3216 - classification_loss: 0.0238 59/500 [==>...........................] - ETA: 2:30 - loss: 0.3456 - regression_loss: 0.3217 - classification_loss: 0.0239 60/500 [==>...........................] - ETA: 2:30 - loss: 0.3459 - regression_loss: 0.3220 - classification_loss: 0.0239 61/500 [==>...........................] - ETA: 2:30 - loss: 0.3439 - regression_loss: 0.3203 - classification_loss: 0.0237 62/500 [==>...........................] - ETA: 2:29 - loss: 0.3438 - regression_loss: 0.3200 - classification_loss: 0.0238 63/500 [==>...........................] - ETA: 2:29 - loss: 0.3433 - regression_loss: 0.3196 - classification_loss: 0.0237 64/500 [==>...........................] - ETA: 2:29 - loss: 0.3442 - regression_loss: 0.3206 - classification_loss: 0.0237 65/500 [==>...........................] - ETA: 2:29 - loss: 0.3452 - regression_loss: 0.3216 - classification_loss: 0.0236 66/500 [==>...........................] - ETA: 2:28 - loss: 0.3459 - regression_loss: 0.3221 - classification_loss: 0.0237 67/500 [===>..........................] - ETA: 2:28 - loss: 0.3454 - regression_loss: 0.3219 - classification_loss: 0.0236 68/500 [===>..........................] 
- ETA: 2:27 - loss: 0.3434 - regression_loss: 0.3199 - classification_loss: 0.0235 69/500 [===>..........................] - ETA: 2:27 - loss: 0.3441 - regression_loss: 0.3208 - classification_loss: 0.0233 70/500 [===>..........................] - ETA: 2:27 - loss: 0.3446 - regression_loss: 0.3213 - classification_loss: 0.0233 71/500 [===>..........................] - ETA: 2:27 - loss: 0.3486 - regression_loss: 0.3252 - classification_loss: 0.0234 72/500 [===>..........................] - ETA: 2:26 - loss: 0.3520 - regression_loss: 0.3280 - classification_loss: 0.0240 73/500 [===>..........................] - ETA: 2:26 - loss: 0.3526 - regression_loss: 0.3288 - classification_loss: 0.0238 74/500 [===>..........................] - ETA: 2:26 - loss: 0.3533 - regression_loss: 0.3293 - classification_loss: 0.0240 75/500 [===>..........................] - ETA: 2:25 - loss: 0.3550 - regression_loss: 0.3309 - classification_loss: 0.0241 76/500 [===>..........................] - ETA: 2:25 - loss: 0.3553 - regression_loss: 0.3309 - classification_loss: 0.0243 77/500 [===>..........................] - ETA: 2:25 - loss: 0.3564 - regression_loss: 0.3322 - classification_loss: 0.0242 78/500 [===>..........................] - ETA: 2:24 - loss: 0.3556 - regression_loss: 0.3315 - classification_loss: 0.0241 79/500 [===>..........................] - ETA: 2:24 - loss: 0.3565 - regression_loss: 0.3321 - classification_loss: 0.0244 80/500 [===>..........................] - ETA: 2:24 - loss: 0.3550 - regression_loss: 0.3308 - classification_loss: 0.0242 81/500 [===>..........................] - ETA: 2:23 - loss: 0.3549 - regression_loss: 0.3307 - classification_loss: 0.0242 82/500 [===>..........................] - ETA: 2:23 - loss: 0.3548 - regression_loss: 0.3307 - classification_loss: 0.0241 83/500 [===>..........................] - ETA: 2:23 - loss: 0.3551 - regression_loss: 0.3308 - classification_loss: 0.0243 84/500 [====>.........................] 
[progress-bar redraws condensed; representative steps 85-419 of this epoch shown]
 85/500 [====>.........................] - ETA: 2:22 - loss: 0.3519 - regression_loss: 0.3279 - classification_loss: 0.0239
100/500 [=====>........................] - ETA: 2:17 - loss: 0.3544 - regression_loss: 0.3307 - classification_loss: 0.0236
150/500 [========>.....................] - ETA: 2:00 - loss: 0.3645 - regression_loss: 0.3395 - classification_loss: 0.0250
200/500 [===========>..................] - ETA: 1:43 - loss: 0.3677 - regression_loss: 0.3422 - classification_loss: 0.0255
250/500 [==============>...............] - ETA: 1:26 - loss: 0.3767 - regression_loss: 0.3510 - classification_loss: 0.0256
300/500 [=================>............] - ETA: 1:08 - loss: 0.3743 - regression_loss: 0.3488 - classification_loss: 0.0255
350/500 [====================>.........] - ETA: 51s - loss: 0.3699 - regression_loss: 0.3449 - classification_loss: 0.0251
400/500 [=======================>......] - ETA: 34s - loss: 0.3727 - regression_loss: 0.3476 - classification_loss: 0.0250
419/500 [========================>.....] - ETA: 27s - loss: 0.3714 - regression_loss: 0.3465 - classification_loss: 0.0249
- ETA: 27s - loss: 0.3719 - regression_loss: 0.3470 - classification_loss: 0.0249 421/500 [========================>.....] - ETA: 27s - loss: 0.3729 - regression_loss: 0.3478 - classification_loss: 0.0251 422/500 [========================>.....] - ETA: 26s - loss: 0.3726 - regression_loss: 0.3476 - classification_loss: 0.0250 423/500 [========================>.....] - ETA: 26s - loss: 0.3729 - regression_loss: 0.3479 - classification_loss: 0.0250 424/500 [========================>.....] - ETA: 26s - loss: 0.3727 - regression_loss: 0.3477 - classification_loss: 0.0250 425/500 [========================>.....] - ETA: 25s - loss: 0.3726 - regression_loss: 0.3476 - classification_loss: 0.0250 426/500 [========================>.....] - ETA: 25s - loss: 0.3722 - regression_loss: 0.3472 - classification_loss: 0.0250 427/500 [========================>.....] - ETA: 25s - loss: 0.3734 - regression_loss: 0.3483 - classification_loss: 0.0251 428/500 [========================>.....] - ETA: 24s - loss: 0.3737 - regression_loss: 0.3486 - classification_loss: 0.0250 429/500 [========================>.....] - ETA: 24s - loss: 0.3745 - regression_loss: 0.3494 - classification_loss: 0.0251 430/500 [========================>.....] - ETA: 24s - loss: 0.3746 - regression_loss: 0.3495 - classification_loss: 0.0251 431/500 [========================>.....] - ETA: 23s - loss: 0.3748 - regression_loss: 0.3496 - classification_loss: 0.0252 432/500 [========================>.....] - ETA: 23s - loss: 0.3749 - regression_loss: 0.3497 - classification_loss: 0.0252 433/500 [========================>.....] - ETA: 23s - loss: 0.3747 - regression_loss: 0.3496 - classification_loss: 0.0252 434/500 [=========================>....] - ETA: 22s - loss: 0.3745 - regression_loss: 0.3493 - classification_loss: 0.0252 435/500 [=========================>....] - ETA: 22s - loss: 0.3741 - regression_loss: 0.3490 - classification_loss: 0.0251 436/500 [=========================>....] 
- ETA: 21s - loss: 0.3744 - regression_loss: 0.3492 - classification_loss: 0.0251 437/500 [=========================>....] - ETA: 21s - loss: 0.3744 - regression_loss: 0.3492 - classification_loss: 0.0251 438/500 [=========================>....] - ETA: 21s - loss: 0.3749 - regression_loss: 0.3497 - classification_loss: 0.0251 439/500 [=========================>....] - ETA: 20s - loss: 0.3748 - regression_loss: 0.3496 - classification_loss: 0.0251 440/500 [=========================>....] - ETA: 20s - loss: 0.3744 - regression_loss: 0.3493 - classification_loss: 0.0251 441/500 [=========================>....] - ETA: 20s - loss: 0.3748 - regression_loss: 0.3496 - classification_loss: 0.0251 442/500 [=========================>....] - ETA: 19s - loss: 0.3748 - regression_loss: 0.3497 - classification_loss: 0.0251 443/500 [=========================>....] - ETA: 19s - loss: 0.3745 - regression_loss: 0.3494 - classification_loss: 0.0251 444/500 [=========================>....] - ETA: 19s - loss: 0.3743 - regression_loss: 0.3492 - classification_loss: 0.0251 445/500 [=========================>....] - ETA: 18s - loss: 0.3745 - regression_loss: 0.3493 - classification_loss: 0.0252 446/500 [=========================>....] - ETA: 18s - loss: 0.3741 - regression_loss: 0.3489 - classification_loss: 0.0252 447/500 [=========================>....] - ETA: 18s - loss: 0.3738 - regression_loss: 0.3487 - classification_loss: 0.0252 448/500 [=========================>....] - ETA: 17s - loss: 0.3738 - regression_loss: 0.3487 - classification_loss: 0.0251 449/500 [=========================>....] - ETA: 17s - loss: 0.3733 - regression_loss: 0.3482 - classification_loss: 0.0251 450/500 [==========================>...] - ETA: 17s - loss: 0.3732 - regression_loss: 0.3481 - classification_loss: 0.0251 451/500 [==========================>...] - ETA: 16s - loss: 0.3732 - regression_loss: 0.3481 - classification_loss: 0.0251 452/500 [==========================>...] 
- ETA: 16s - loss: 0.3731 - regression_loss: 0.3480 - classification_loss: 0.0251 453/500 [==========================>...] - ETA: 16s - loss: 0.3732 - regression_loss: 0.3481 - classification_loss: 0.0251 454/500 [==========================>...] - ETA: 15s - loss: 0.3736 - regression_loss: 0.3485 - classification_loss: 0.0251 455/500 [==========================>...] - ETA: 15s - loss: 0.3738 - regression_loss: 0.3487 - classification_loss: 0.0251 456/500 [==========================>...] - ETA: 15s - loss: 0.3736 - regression_loss: 0.3485 - classification_loss: 0.0251 457/500 [==========================>...] - ETA: 14s - loss: 0.3735 - regression_loss: 0.3484 - classification_loss: 0.0251 458/500 [==========================>...] - ETA: 14s - loss: 0.3731 - regression_loss: 0.3480 - classification_loss: 0.0251 459/500 [==========================>...] - ETA: 14s - loss: 0.3729 - regression_loss: 0.3478 - classification_loss: 0.0251 460/500 [==========================>...] - ETA: 13s - loss: 0.3727 - regression_loss: 0.3476 - classification_loss: 0.0251 461/500 [==========================>...] - ETA: 13s - loss: 0.3723 - regression_loss: 0.3472 - classification_loss: 0.0251 462/500 [==========================>...] - ETA: 13s - loss: 0.3720 - regression_loss: 0.3469 - classification_loss: 0.0250 463/500 [==========================>...] - ETA: 12s - loss: 0.3717 - regression_loss: 0.3466 - classification_loss: 0.0250 464/500 [==========================>...] - ETA: 12s - loss: 0.3715 - regression_loss: 0.3465 - classification_loss: 0.0250 465/500 [==========================>...] - ETA: 12s - loss: 0.3711 - regression_loss: 0.3461 - classification_loss: 0.0250 466/500 [==========================>...] - ETA: 11s - loss: 0.3715 - regression_loss: 0.3466 - classification_loss: 0.0250 467/500 [===========================>..] - ETA: 11s - loss: 0.3714 - regression_loss: 0.3464 - classification_loss: 0.0250 468/500 [===========================>..] 
- ETA: 10s - loss: 0.3713 - regression_loss: 0.3463 - classification_loss: 0.0250 469/500 [===========================>..] - ETA: 10s - loss: 0.3715 - regression_loss: 0.3465 - classification_loss: 0.0250 470/500 [===========================>..] - ETA: 10s - loss: 0.3711 - regression_loss: 0.3462 - classification_loss: 0.0250 471/500 [===========================>..] - ETA: 9s - loss: 0.3710 - regression_loss: 0.3461 - classification_loss: 0.0249  472/500 [===========================>..] - ETA: 9s - loss: 0.3709 - regression_loss: 0.3459 - classification_loss: 0.0249 473/500 [===========================>..] - ETA: 9s - loss: 0.3712 - regression_loss: 0.3462 - classification_loss: 0.0249 474/500 [===========================>..] - ETA: 8s - loss: 0.3711 - regression_loss: 0.3461 - classification_loss: 0.0249 475/500 [===========================>..] - ETA: 8s - loss: 0.3712 - regression_loss: 0.3463 - classification_loss: 0.0249 476/500 [===========================>..] - ETA: 8s - loss: 0.3714 - regression_loss: 0.3464 - classification_loss: 0.0250 477/500 [===========================>..] - ETA: 7s - loss: 0.3713 - regression_loss: 0.3463 - classification_loss: 0.0250 478/500 [===========================>..] - ETA: 7s - loss: 0.3715 - regression_loss: 0.3465 - classification_loss: 0.0250 479/500 [===========================>..] - ETA: 7s - loss: 0.3715 - regression_loss: 0.3465 - classification_loss: 0.0250 480/500 [===========================>..] - ETA: 6s - loss: 0.3709 - regression_loss: 0.3459 - classification_loss: 0.0250 481/500 [===========================>..] - ETA: 6s - loss: 0.3710 - regression_loss: 0.3459 - classification_loss: 0.0250 482/500 [===========================>..] - ETA: 6s - loss: 0.3706 - regression_loss: 0.3456 - classification_loss: 0.0250 483/500 [===========================>..] - ETA: 5s - loss: 0.3706 - regression_loss: 0.3456 - classification_loss: 0.0250 484/500 [============================>.] 
- ETA: 5s - loss: 0.3706 - regression_loss: 0.3456 - classification_loss: 0.0250 485/500 [============================>.] - ETA: 5s - loss: 0.3702 - regression_loss: 0.3452 - classification_loss: 0.0250 486/500 [============================>.] - ETA: 4s - loss: 0.3701 - regression_loss: 0.3452 - classification_loss: 0.0250 487/500 [============================>.] - ETA: 4s - loss: 0.3699 - regression_loss: 0.3449 - classification_loss: 0.0249 488/500 [============================>.] - ETA: 4s - loss: 0.3698 - regression_loss: 0.3449 - classification_loss: 0.0249 489/500 [============================>.] - ETA: 3s - loss: 0.3696 - regression_loss: 0.3446 - classification_loss: 0.0250 490/500 [============================>.] - ETA: 3s - loss: 0.3698 - regression_loss: 0.3448 - classification_loss: 0.0250 491/500 [============================>.] - ETA: 3s - loss: 0.3696 - regression_loss: 0.3447 - classification_loss: 0.0249 492/500 [============================>.] - ETA: 2s - loss: 0.3698 - regression_loss: 0.3449 - classification_loss: 0.0250 493/500 [============================>.] - ETA: 2s - loss: 0.3697 - regression_loss: 0.3447 - classification_loss: 0.0249 494/500 [============================>.] - ETA: 2s - loss: 0.3693 - regression_loss: 0.3444 - classification_loss: 0.0249 495/500 [============================>.] - ETA: 1s - loss: 0.3691 - regression_loss: 0.3443 - classification_loss: 0.0249 496/500 [============================>.] - ETA: 1s - loss: 0.3696 - regression_loss: 0.3447 - classification_loss: 0.0249 497/500 [============================>.] - ETA: 1s - loss: 0.3698 - regression_loss: 0.3449 - classification_loss: 0.0249 498/500 [============================>.] - ETA: 0s - loss: 0.3698 - regression_loss: 0.3449 - classification_loss: 0.0249 499/500 [============================>.] 
- ETA: 0s - loss: 0.3700 - regression_loss: 0.3451 - classification_loss: 0.0249 500/500 [==============================] - 172s 344ms/step - loss: 0.3711 - regression_loss: 0.3460 - classification_loss: 0.0251 1172 instances of class plum with average precision: 0.7295 mAP: 0.7295 Epoch 00030: saving model to ./training/snapshots/resnet101_pascal_30.h5 Epoch 31/150 1/500 [..............................] - ETA: 2:49 - loss: 0.1832 - regression_loss: 0.1721 - classification_loss: 0.0111 2/500 [..............................] - ETA: 2:51 - loss: 0.2661 - regression_loss: 0.2504 - classification_loss: 0.0157 3/500 [..............................] - ETA: 2:51 - loss: 0.3537 - regression_loss: 0.3299 - classification_loss: 0.0237 4/500 [..............................] - ETA: 2:47 - loss: 0.2905 - regression_loss: 0.2709 - classification_loss: 0.0196 5/500 [..............................] - ETA: 2:47 - loss: 0.2770 - regression_loss: 0.2569 - classification_loss: 0.0201 6/500 [..............................] - ETA: 2:47 - loss: 0.2966 - regression_loss: 0.2744 - classification_loss: 0.0222 7/500 [..............................] - ETA: 2:46 - loss: 0.3390 - regression_loss: 0.3100 - classification_loss: 0.0290 8/500 [..............................] - ETA: 2:46 - loss: 0.3418 - regression_loss: 0.3144 - classification_loss: 0.0274 9/500 [..............................] - ETA: 2:46 - loss: 0.3463 - regression_loss: 0.3208 - classification_loss: 0.0255 10/500 [..............................] - ETA: 2:45 - loss: 0.3331 - regression_loss: 0.3096 - classification_loss: 0.0235 11/500 [..............................] - ETA: 2:46 - loss: 0.3250 - regression_loss: 0.3032 - classification_loss: 0.0219 12/500 [..............................] - ETA: 2:45 - loss: 0.3189 - regression_loss: 0.2976 - classification_loss: 0.0213 13/500 [..............................] - ETA: 2:45 - loss: 0.3395 - regression_loss: 0.3177 - classification_loss: 0.0218 14/500 [..............................] 
[per-batch progress updates for epoch 31 trimmed; log section ends mid-epoch at step 190/500]
- ETA: 1:45 - loss: 0.3354 - regression_loss: 0.3135 - classification_loss: 0.0219 191/500 [==========>...................] - ETA: 1:45 - loss: 0.3357 - regression_loss: 0.3138 - classification_loss: 0.0219 192/500 [==========>...................] - ETA: 1:45 - loss: 0.3360 - regression_loss: 0.3141 - classification_loss: 0.0219 193/500 [==========>...................] - ETA: 1:44 - loss: 0.3355 - regression_loss: 0.3136 - classification_loss: 0.0218 194/500 [==========>...................] - ETA: 1:44 - loss: 0.3357 - regression_loss: 0.3139 - classification_loss: 0.0218 195/500 [==========>...................] - ETA: 1:44 - loss: 0.3378 - regression_loss: 0.3158 - classification_loss: 0.0220 196/500 [==========>...................] - ETA: 1:43 - loss: 0.3374 - regression_loss: 0.3154 - classification_loss: 0.0220 197/500 [==========>...................] - ETA: 1:43 - loss: 0.3380 - regression_loss: 0.3159 - classification_loss: 0.0221 198/500 [==========>...................] - ETA: 1:43 - loss: 0.3379 - regression_loss: 0.3159 - classification_loss: 0.0220 199/500 [==========>...................] - ETA: 1:42 - loss: 0.3381 - regression_loss: 0.3161 - classification_loss: 0.0221 200/500 [===========>..................] - ETA: 1:42 - loss: 0.3371 - regression_loss: 0.3151 - classification_loss: 0.0220 201/500 [===========>..................] - ETA: 1:42 - loss: 0.3367 - regression_loss: 0.3147 - classification_loss: 0.0220 202/500 [===========>..................] - ETA: 1:41 - loss: 0.3375 - regression_loss: 0.3154 - classification_loss: 0.0220 203/500 [===========>..................] - ETA: 1:41 - loss: 0.3377 - regression_loss: 0.3157 - classification_loss: 0.0220 204/500 [===========>..................] - ETA: 1:40 - loss: 0.3375 - regression_loss: 0.3155 - classification_loss: 0.0220 205/500 [===========>..................] - ETA: 1:40 - loss: 0.3382 - regression_loss: 0.3161 - classification_loss: 0.0221 206/500 [===========>..................] 
- ETA: 1:40 - loss: 0.3395 - regression_loss: 0.3171 - classification_loss: 0.0224 207/500 [===========>..................] - ETA: 1:39 - loss: 0.3395 - regression_loss: 0.3171 - classification_loss: 0.0223 208/500 [===========>..................] - ETA: 1:39 - loss: 0.3390 - regression_loss: 0.3167 - classification_loss: 0.0223 209/500 [===========>..................] - ETA: 1:39 - loss: 0.3386 - regression_loss: 0.3163 - classification_loss: 0.0223 210/500 [===========>..................] - ETA: 1:38 - loss: 0.3391 - regression_loss: 0.3168 - classification_loss: 0.0223 211/500 [===========>..................] - ETA: 1:38 - loss: 0.3392 - regression_loss: 0.3169 - classification_loss: 0.0223 212/500 [===========>..................] - ETA: 1:38 - loss: 0.3395 - regression_loss: 0.3171 - classification_loss: 0.0224 213/500 [===========>..................] - ETA: 1:37 - loss: 0.3387 - regression_loss: 0.3164 - classification_loss: 0.0223 214/500 [===========>..................] - ETA: 1:37 - loss: 0.3392 - regression_loss: 0.3168 - classification_loss: 0.0224 215/500 [===========>..................] - ETA: 1:37 - loss: 0.3387 - regression_loss: 0.3163 - classification_loss: 0.0224 216/500 [===========>..................] - ETA: 1:36 - loss: 0.3391 - regression_loss: 0.3167 - classification_loss: 0.0224 217/500 [============>.................] - ETA: 1:36 - loss: 0.3388 - regression_loss: 0.3164 - classification_loss: 0.0224 218/500 [============>.................] - ETA: 1:36 - loss: 0.3382 - regression_loss: 0.3159 - classification_loss: 0.0224 219/500 [============>.................] - ETA: 1:35 - loss: 0.3386 - regression_loss: 0.3163 - classification_loss: 0.0223 220/500 [============>.................] - ETA: 1:35 - loss: 0.3383 - regression_loss: 0.3160 - classification_loss: 0.0223 221/500 [============>.................] - ETA: 1:35 - loss: 0.3375 - regression_loss: 0.3153 - classification_loss: 0.0223 222/500 [============>.................] 
- ETA: 1:34 - loss: 0.3376 - regression_loss: 0.3153 - classification_loss: 0.0223 223/500 [============>.................] - ETA: 1:34 - loss: 0.3371 - regression_loss: 0.3149 - classification_loss: 0.0222 224/500 [============>.................] - ETA: 1:34 - loss: 0.3372 - regression_loss: 0.3149 - classification_loss: 0.0223 225/500 [============>.................] - ETA: 1:33 - loss: 0.3374 - regression_loss: 0.3152 - classification_loss: 0.0223 226/500 [============>.................] - ETA: 1:33 - loss: 0.3380 - regression_loss: 0.3158 - classification_loss: 0.0222 227/500 [============>.................] - ETA: 1:33 - loss: 0.3381 - regression_loss: 0.3159 - classification_loss: 0.0222 228/500 [============>.................] - ETA: 1:32 - loss: 0.3378 - regression_loss: 0.3156 - classification_loss: 0.0222 229/500 [============>.................] - ETA: 1:32 - loss: 0.3375 - regression_loss: 0.3154 - classification_loss: 0.0221 230/500 [============>.................] - ETA: 1:32 - loss: 0.3374 - regression_loss: 0.3153 - classification_loss: 0.0221 231/500 [============>.................] - ETA: 1:31 - loss: 0.3373 - regression_loss: 0.3152 - classification_loss: 0.0221 232/500 [============>.................] - ETA: 1:31 - loss: 0.3376 - regression_loss: 0.3155 - classification_loss: 0.0221 233/500 [============>.................] - ETA: 1:31 - loss: 0.3369 - regression_loss: 0.3149 - classification_loss: 0.0221 234/500 [=============>................] - ETA: 1:30 - loss: 0.3365 - regression_loss: 0.3145 - classification_loss: 0.0220 235/500 [=============>................] - ETA: 1:30 - loss: 0.3367 - regression_loss: 0.3147 - classification_loss: 0.0220 236/500 [=============>................] - ETA: 1:30 - loss: 0.3369 - regression_loss: 0.3149 - classification_loss: 0.0220 237/500 [=============>................] - ETA: 1:29 - loss: 0.3375 - regression_loss: 0.3155 - classification_loss: 0.0221 238/500 [=============>................] 
- ETA: 1:29 - loss: 0.3368 - regression_loss: 0.3148 - classification_loss: 0.0220 239/500 [=============>................] - ETA: 1:29 - loss: 0.3377 - regression_loss: 0.3155 - classification_loss: 0.0221 240/500 [=============>................] - ETA: 1:28 - loss: 0.3376 - regression_loss: 0.3155 - classification_loss: 0.0221 241/500 [=============>................] - ETA: 1:28 - loss: 0.3379 - regression_loss: 0.3157 - classification_loss: 0.0223 242/500 [=============>................] - ETA: 1:27 - loss: 0.3380 - regression_loss: 0.3158 - classification_loss: 0.0223 243/500 [=============>................] - ETA: 1:27 - loss: 0.3384 - regression_loss: 0.3161 - classification_loss: 0.0223 244/500 [=============>................] - ETA: 1:27 - loss: 0.3384 - regression_loss: 0.3159 - classification_loss: 0.0224 245/500 [=============>................] - ETA: 1:26 - loss: 0.3386 - regression_loss: 0.3162 - classification_loss: 0.0225 246/500 [=============>................] - ETA: 1:26 - loss: 0.3384 - regression_loss: 0.3159 - classification_loss: 0.0225 247/500 [=============>................] - ETA: 1:26 - loss: 0.3393 - regression_loss: 0.3168 - classification_loss: 0.0226 248/500 [=============>................] - ETA: 1:25 - loss: 0.3397 - regression_loss: 0.3171 - classification_loss: 0.0226 249/500 [=============>................] - ETA: 1:25 - loss: 0.3402 - regression_loss: 0.3177 - classification_loss: 0.0226 250/500 [==============>...............] - ETA: 1:25 - loss: 0.3398 - regression_loss: 0.3172 - classification_loss: 0.0225 251/500 [==============>...............] - ETA: 1:24 - loss: 0.3394 - regression_loss: 0.3169 - classification_loss: 0.0225 252/500 [==============>...............] - ETA: 1:24 - loss: 0.3389 - regression_loss: 0.3165 - classification_loss: 0.0224 253/500 [==============>...............] - ETA: 1:24 - loss: 0.3385 - regression_loss: 0.3162 - classification_loss: 0.0224 254/500 [==============>...............] 
- ETA: 1:23 - loss: 0.3379 - regression_loss: 0.3156 - classification_loss: 0.0223 255/500 [==============>...............] - ETA: 1:23 - loss: 0.3388 - regression_loss: 0.3164 - classification_loss: 0.0224 256/500 [==============>...............] - ETA: 1:23 - loss: 0.3391 - regression_loss: 0.3167 - classification_loss: 0.0224 257/500 [==============>...............] - ETA: 1:22 - loss: 0.3396 - regression_loss: 0.3173 - classification_loss: 0.0224 258/500 [==============>...............] - ETA: 1:22 - loss: 0.3391 - regression_loss: 0.3168 - classification_loss: 0.0223 259/500 [==============>...............] - ETA: 1:22 - loss: 0.3385 - regression_loss: 0.3162 - classification_loss: 0.0223 260/500 [==============>...............] - ETA: 1:21 - loss: 0.3380 - regression_loss: 0.3157 - classification_loss: 0.0223 261/500 [==============>...............] - ETA: 1:21 - loss: 0.3377 - regression_loss: 0.3155 - classification_loss: 0.0222 262/500 [==============>...............] - ETA: 1:21 - loss: 0.3382 - regression_loss: 0.3159 - classification_loss: 0.0223 263/500 [==============>...............] - ETA: 1:20 - loss: 0.3380 - regression_loss: 0.3157 - classification_loss: 0.0223 264/500 [==============>...............] - ETA: 1:20 - loss: 0.3382 - regression_loss: 0.3158 - classification_loss: 0.0223 265/500 [==============>...............] - ETA: 1:20 - loss: 0.3373 - regression_loss: 0.3150 - classification_loss: 0.0223 266/500 [==============>...............] - ETA: 1:19 - loss: 0.3369 - regression_loss: 0.3147 - classification_loss: 0.0222 267/500 [===============>..............] - ETA: 1:19 - loss: 0.3377 - regression_loss: 0.3153 - classification_loss: 0.0224 268/500 [===============>..............] - ETA: 1:19 - loss: 0.3379 - regression_loss: 0.3155 - classification_loss: 0.0224 269/500 [===============>..............] - ETA: 1:18 - loss: 0.3380 - regression_loss: 0.3156 - classification_loss: 0.0224 270/500 [===============>..............] 
- ETA: 1:18 - loss: 0.3387 - regression_loss: 0.3163 - classification_loss: 0.0224 271/500 [===============>..............] - ETA: 1:18 - loss: 0.3393 - regression_loss: 0.3169 - classification_loss: 0.0224 272/500 [===============>..............] - ETA: 1:17 - loss: 0.3387 - regression_loss: 0.3163 - classification_loss: 0.0224 273/500 [===============>..............] - ETA: 1:17 - loss: 0.3390 - regression_loss: 0.3166 - classification_loss: 0.0224 274/500 [===============>..............] - ETA: 1:17 - loss: 0.3389 - regression_loss: 0.3165 - classification_loss: 0.0224 275/500 [===============>..............] - ETA: 1:16 - loss: 0.3385 - regression_loss: 0.3161 - classification_loss: 0.0224 276/500 [===============>..............] - ETA: 1:16 - loss: 0.3389 - regression_loss: 0.3165 - classification_loss: 0.0224 277/500 [===============>..............] - ETA: 1:16 - loss: 0.3388 - regression_loss: 0.3165 - classification_loss: 0.0224 278/500 [===============>..............] - ETA: 1:15 - loss: 0.3385 - regression_loss: 0.3161 - classification_loss: 0.0223 279/500 [===============>..............] - ETA: 1:15 - loss: 0.3386 - regression_loss: 0.3162 - classification_loss: 0.0224 280/500 [===============>..............] - ETA: 1:15 - loss: 0.3385 - regression_loss: 0.3161 - classification_loss: 0.0224 281/500 [===============>..............] - ETA: 1:14 - loss: 0.3382 - regression_loss: 0.3158 - classification_loss: 0.0224 282/500 [===============>..............] - ETA: 1:14 - loss: 0.3381 - regression_loss: 0.3157 - classification_loss: 0.0224 283/500 [===============>..............] - ETA: 1:13 - loss: 0.3378 - regression_loss: 0.3155 - classification_loss: 0.0224 284/500 [================>.............] - ETA: 1:13 - loss: 0.3374 - regression_loss: 0.3151 - classification_loss: 0.0223 285/500 [================>.............] - ETA: 1:13 - loss: 0.3371 - regression_loss: 0.3148 - classification_loss: 0.0224 286/500 [================>.............] 
- ETA: 1:12 - loss: 0.3367 - regression_loss: 0.3144 - classification_loss: 0.0223 287/500 [================>.............] - ETA: 1:12 - loss: 0.3366 - regression_loss: 0.3143 - classification_loss: 0.0223 288/500 [================>.............] - ETA: 1:12 - loss: 0.3363 - regression_loss: 0.3140 - classification_loss: 0.0223 289/500 [================>.............] - ETA: 1:11 - loss: 0.3363 - regression_loss: 0.3140 - classification_loss: 0.0223 290/500 [================>.............] - ETA: 1:11 - loss: 0.3364 - regression_loss: 0.3141 - classification_loss: 0.0222 291/500 [================>.............] - ETA: 1:11 - loss: 0.3361 - regression_loss: 0.3139 - classification_loss: 0.0222 292/500 [================>.............] - ETA: 1:10 - loss: 0.3369 - regression_loss: 0.3146 - classification_loss: 0.0223 293/500 [================>.............] - ETA: 1:10 - loss: 0.3377 - regression_loss: 0.3152 - classification_loss: 0.0225 294/500 [================>.............] - ETA: 1:10 - loss: 0.3378 - regression_loss: 0.3154 - classification_loss: 0.0224 295/500 [================>.............] - ETA: 1:09 - loss: 0.3381 - regression_loss: 0.3157 - classification_loss: 0.0224 296/500 [================>.............] - ETA: 1:09 - loss: 0.3388 - regression_loss: 0.3163 - classification_loss: 0.0225 297/500 [================>.............] - ETA: 1:09 - loss: 0.3390 - regression_loss: 0.3165 - classification_loss: 0.0225 298/500 [================>.............] - ETA: 1:08 - loss: 0.3383 - regression_loss: 0.3159 - classification_loss: 0.0224 299/500 [================>.............] - ETA: 1:08 - loss: 0.3389 - regression_loss: 0.3164 - classification_loss: 0.0225 300/500 [=================>............] - ETA: 1:08 - loss: 0.3383 - regression_loss: 0.3159 - classification_loss: 0.0224 301/500 [=================>............] - ETA: 1:07 - loss: 0.3378 - regression_loss: 0.3154 - classification_loss: 0.0224 302/500 [=================>............] 
- ETA: 1:07 - loss: 0.3374 - regression_loss: 0.3151 - classification_loss: 0.0224 303/500 [=================>............] - ETA: 1:07 - loss: 0.3381 - regression_loss: 0.3157 - classification_loss: 0.0224 304/500 [=================>............] - ETA: 1:06 - loss: 0.3380 - regression_loss: 0.3157 - classification_loss: 0.0224 305/500 [=================>............] - ETA: 1:06 - loss: 0.3388 - regression_loss: 0.3162 - classification_loss: 0.0225 306/500 [=================>............] - ETA: 1:06 - loss: 0.3390 - regression_loss: 0.3165 - classification_loss: 0.0226 307/500 [=================>............] - ETA: 1:05 - loss: 0.3394 - regression_loss: 0.3168 - classification_loss: 0.0226 308/500 [=================>............] - ETA: 1:05 - loss: 0.3393 - regression_loss: 0.3167 - classification_loss: 0.0225 309/500 [=================>............] - ETA: 1:05 - loss: 0.3391 - regression_loss: 0.3166 - classification_loss: 0.0225 310/500 [=================>............] - ETA: 1:04 - loss: 0.3386 - regression_loss: 0.3161 - classification_loss: 0.0225 311/500 [=================>............] - ETA: 1:04 - loss: 0.3389 - regression_loss: 0.3164 - classification_loss: 0.0225 312/500 [=================>............] - ETA: 1:04 - loss: 0.3387 - regression_loss: 0.3162 - classification_loss: 0.0225 313/500 [=================>............] - ETA: 1:03 - loss: 0.3389 - regression_loss: 0.3163 - classification_loss: 0.0225 314/500 [=================>............] - ETA: 1:03 - loss: 0.3391 - regression_loss: 0.3165 - classification_loss: 0.0226 315/500 [=================>............] - ETA: 1:03 - loss: 0.3389 - regression_loss: 0.3164 - classification_loss: 0.0226 316/500 [=================>............] - ETA: 1:02 - loss: 0.3393 - regression_loss: 0.3167 - classification_loss: 0.0226 317/500 [==================>...........] - ETA: 1:02 - loss: 0.3390 - regression_loss: 0.3164 - classification_loss: 0.0226 318/500 [==================>...........] 
- ETA: 1:02 - loss: 0.3390 - regression_loss: 0.3165 - classification_loss: 0.0225 319/500 [==================>...........] - ETA: 1:01 - loss: 0.3386 - regression_loss: 0.3161 - classification_loss: 0.0225 320/500 [==================>...........] - ETA: 1:01 - loss: 0.3390 - regression_loss: 0.3165 - classification_loss: 0.0225 321/500 [==================>...........] - ETA: 1:01 - loss: 0.3389 - regression_loss: 0.3165 - classification_loss: 0.0225 322/500 [==================>...........] - ETA: 1:00 - loss: 0.3389 - regression_loss: 0.3164 - classification_loss: 0.0225 323/500 [==================>...........] - ETA: 1:00 - loss: 0.3388 - regression_loss: 0.3163 - classification_loss: 0.0225 324/500 [==================>...........] - ETA: 1:00 - loss: 0.3384 - regression_loss: 0.3160 - classification_loss: 0.0224 325/500 [==================>...........] - ETA: 59s - loss: 0.3381 - regression_loss: 0.3157 - classification_loss: 0.0224  326/500 [==================>...........] - ETA: 59s - loss: 0.3379 - regression_loss: 0.3155 - classification_loss: 0.0224 327/500 [==================>...........] - ETA: 59s - loss: 0.3377 - regression_loss: 0.3153 - classification_loss: 0.0224 328/500 [==================>...........] - ETA: 58s - loss: 0.3382 - regression_loss: 0.3157 - classification_loss: 0.0225 329/500 [==================>...........] - ETA: 58s - loss: 0.3383 - regression_loss: 0.3158 - classification_loss: 0.0225 330/500 [==================>...........] - ETA: 57s - loss: 0.3383 - regression_loss: 0.3158 - classification_loss: 0.0225 331/500 [==================>...........] - ETA: 57s - loss: 0.3385 - regression_loss: 0.3160 - classification_loss: 0.0225 332/500 [==================>...........] - ETA: 57s - loss: 0.3388 - regression_loss: 0.3163 - classification_loss: 0.0225 333/500 [==================>...........] - ETA: 56s - loss: 0.3385 - regression_loss: 0.3160 - classification_loss: 0.0225 334/500 [===================>..........] 
- ETA: 56s - loss: 0.3394 - regression_loss: 0.3169 - classification_loss: 0.0225 335/500 [===================>..........] - ETA: 56s - loss: 0.3394 - regression_loss: 0.3169 - classification_loss: 0.0225 336/500 [===================>..........] - ETA: 55s - loss: 0.3389 - regression_loss: 0.3165 - classification_loss: 0.0225 337/500 [===================>..........] - ETA: 55s - loss: 0.3387 - regression_loss: 0.3162 - classification_loss: 0.0224 338/500 [===================>..........] - ETA: 55s - loss: 0.3390 - regression_loss: 0.3165 - classification_loss: 0.0225 339/500 [===================>..........] - ETA: 54s - loss: 0.3389 - regression_loss: 0.3164 - classification_loss: 0.0225 340/500 [===================>..........] - ETA: 54s - loss: 0.3388 - regression_loss: 0.3163 - classification_loss: 0.0225 341/500 [===================>..........] - ETA: 54s - loss: 0.3391 - regression_loss: 0.3166 - classification_loss: 0.0225 342/500 [===================>..........] - ETA: 53s - loss: 0.3387 - regression_loss: 0.3162 - classification_loss: 0.0225 343/500 [===================>..........] - ETA: 53s - loss: 0.3385 - regression_loss: 0.3161 - classification_loss: 0.0224 344/500 [===================>..........] - ETA: 53s - loss: 0.3385 - regression_loss: 0.3161 - classification_loss: 0.0224 345/500 [===================>..........] - ETA: 52s - loss: 0.3380 - regression_loss: 0.3157 - classification_loss: 0.0223 346/500 [===================>..........] - ETA: 52s - loss: 0.3378 - regression_loss: 0.3155 - classification_loss: 0.0224 347/500 [===================>..........] - ETA: 52s - loss: 0.3380 - regression_loss: 0.3156 - classification_loss: 0.0224 348/500 [===================>..........] - ETA: 51s - loss: 0.3377 - regression_loss: 0.3153 - classification_loss: 0.0224 349/500 [===================>..........] - ETA: 51s - loss: 0.3375 - regression_loss: 0.3152 - classification_loss: 0.0224 350/500 [====================>.........] 
- ETA: 51s - loss: 0.3379 - regression_loss: 0.3155 - classification_loss: 0.0224 351/500 [====================>.........] - ETA: 50s - loss: 0.3378 - regression_loss: 0.3154 - classification_loss: 0.0224 352/500 [====================>.........] - ETA: 50s - loss: 0.3384 - regression_loss: 0.3159 - classification_loss: 0.0225 353/500 [====================>.........] - ETA: 50s - loss: 0.3382 - regression_loss: 0.3157 - classification_loss: 0.0225 354/500 [====================>.........] - ETA: 49s - loss: 0.3389 - regression_loss: 0.3164 - classification_loss: 0.0225 355/500 [====================>.........] - ETA: 49s - loss: 0.3388 - regression_loss: 0.3163 - classification_loss: 0.0225 356/500 [====================>.........] - ETA: 49s - loss: 0.3383 - regression_loss: 0.3159 - classification_loss: 0.0225 357/500 [====================>.........] - ETA: 48s - loss: 0.3383 - regression_loss: 0.3159 - classification_loss: 0.0224 358/500 [====================>.........] - ETA: 48s - loss: 0.3383 - regression_loss: 0.3159 - classification_loss: 0.0224 359/500 [====================>.........] - ETA: 48s - loss: 0.3383 - regression_loss: 0.3159 - classification_loss: 0.0224 360/500 [====================>.........] - ETA: 47s - loss: 0.3385 - regression_loss: 0.3161 - classification_loss: 0.0224 361/500 [====================>.........] - ETA: 47s - loss: 0.3386 - regression_loss: 0.3161 - classification_loss: 0.0224 362/500 [====================>.........] - ETA: 47s - loss: 0.3384 - regression_loss: 0.3159 - classification_loss: 0.0224 363/500 [====================>.........] - ETA: 46s - loss: 0.3382 - regression_loss: 0.3158 - classification_loss: 0.0224 364/500 [====================>.........] - ETA: 46s - loss: 0.3384 - regression_loss: 0.3160 - classification_loss: 0.0224 365/500 [====================>.........] - ETA: 46s - loss: 0.3381 - regression_loss: 0.3158 - classification_loss: 0.0224 366/500 [====================>.........] 
- ETA: 45s - loss: 0.3384 - regression_loss: 0.3160 - classification_loss: 0.0223 367/500 [=====================>........] - ETA: 45s - loss: 0.3381 - regression_loss: 0.3158 - classification_loss: 0.0223 368/500 [=====================>........] - ETA: 45s - loss: 0.3378 - regression_loss: 0.3155 - classification_loss: 0.0223 369/500 [=====================>........] - ETA: 44s - loss: 0.3384 - regression_loss: 0.3161 - classification_loss: 0.0223 370/500 [=====================>........] - ETA: 44s - loss: 0.3382 - regression_loss: 0.3159 - classification_loss: 0.0223 371/500 [=====================>........] - ETA: 44s - loss: 0.3379 - regression_loss: 0.3157 - classification_loss: 0.0223 372/500 [=====================>........] - ETA: 43s - loss: 0.3381 - regression_loss: 0.3158 - classification_loss: 0.0223 373/500 [=====================>........] - ETA: 43s - loss: 0.3378 - regression_loss: 0.3155 - classification_loss: 0.0223 374/500 [=====================>........] - ETA: 42s - loss: 0.3376 - regression_loss: 0.3154 - classification_loss: 0.0223 375/500 [=====================>........] - ETA: 42s - loss: 0.3378 - regression_loss: 0.3154 - classification_loss: 0.0223 376/500 [=====================>........] - ETA: 42s - loss: 0.3377 - regression_loss: 0.3154 - classification_loss: 0.0223 377/500 [=====================>........] - ETA: 41s - loss: 0.3375 - regression_loss: 0.3152 - classification_loss: 0.0223 378/500 [=====================>........] - ETA: 41s - loss: 0.3375 - regression_loss: 0.3152 - classification_loss: 0.0223 379/500 [=====================>........] - ETA: 41s - loss: 0.3380 - regression_loss: 0.3157 - classification_loss: 0.0223 380/500 [=====================>........] - ETA: 40s - loss: 0.3386 - regression_loss: 0.3163 - classification_loss: 0.0223 381/500 [=====================>........] - ETA: 40s - loss: 0.3385 - regression_loss: 0.3162 - classification_loss: 0.0223 382/500 [=====================>........] 
- ETA: 40s - loss: 0.3389 - regression_loss: 0.3166 - classification_loss: 0.0224 383/500 [=====================>........] - ETA: 39s - loss: 0.3388 - regression_loss: 0.3165 - classification_loss: 0.0223 384/500 [======================>.......] - ETA: 39s - loss: 0.3393 - regression_loss: 0.3169 - classification_loss: 0.0224 385/500 [======================>.......] - ETA: 39s - loss: 0.3409 - regression_loss: 0.3185 - classification_loss: 0.0224 386/500 [======================>.......] - ETA: 38s - loss: 0.3407 - regression_loss: 0.3183 - classification_loss: 0.0224 387/500 [======================>.......] - ETA: 38s - loss: 0.3414 - regression_loss: 0.3190 - classification_loss: 0.0224 388/500 [======================>.......] - ETA: 38s - loss: 0.3417 - regression_loss: 0.3192 - classification_loss: 0.0224 389/500 [======================>.......] - ETA: 37s - loss: 0.3414 - regression_loss: 0.3190 - classification_loss: 0.0224 390/500 [======================>.......] - ETA: 37s - loss: 0.3411 - regression_loss: 0.3187 - classification_loss: 0.0224 391/500 [======================>.......] - ETA: 37s - loss: 0.3408 - regression_loss: 0.3184 - classification_loss: 0.0224 392/500 [======================>.......] - ETA: 36s - loss: 0.3406 - regression_loss: 0.3182 - classification_loss: 0.0223 393/500 [======================>.......] - ETA: 36s - loss: 0.3415 - regression_loss: 0.3190 - classification_loss: 0.0224 394/500 [======================>.......] - ETA: 36s - loss: 0.3416 - regression_loss: 0.3192 - classification_loss: 0.0224 395/500 [======================>.......] - ETA: 35s - loss: 0.3410 - regression_loss: 0.3187 - classification_loss: 0.0223 396/500 [======================>.......] - ETA: 35s - loss: 0.3411 - regression_loss: 0.3187 - classification_loss: 0.0223 397/500 [======================>.......] - ETA: 35s - loss: 0.3412 - regression_loss: 0.3189 - classification_loss: 0.0223 398/500 [======================>.......] 
[... per-batch progress output for steps 398-499 of epoch 31 omitted ...]
500/500 [==============================] - 171s 342ms/step - loss: 0.3409 - regression_loss: 0.3184 - classification_loss: 0.0225
1172 instances of class plum with average precision: 0.7393
mAP: 0.7393
Epoch 00031: saving model to ./training/snapshots/resnet101_pascal_31.h5
Epoch 32/150
[... per-batch progress output for steps 1-8 of epoch 32 omitted ...]
[... per-batch progress output for epoch 32 (steps 9-233) omitted; the log section ends mid-epoch at step 233, with loss around 0.339 ...]
- ETA: 1:31 - loss: 0.3387 - regression_loss: 0.3153 - classification_loss: 0.0234 234/500 [=============>................] - ETA: 1:31 - loss: 0.3379 - regression_loss: 0.3146 - classification_loss: 0.0233 235/500 [=============>................] - ETA: 1:31 - loss: 0.3378 - regression_loss: 0.3145 - classification_loss: 0.0233 236/500 [=============>................] - ETA: 1:30 - loss: 0.3377 - regression_loss: 0.3144 - classification_loss: 0.0233 237/500 [=============>................] - ETA: 1:30 - loss: 0.3371 - regression_loss: 0.3139 - classification_loss: 0.0232 238/500 [=============>................] - ETA: 1:30 - loss: 0.3370 - regression_loss: 0.3137 - classification_loss: 0.0233 239/500 [=============>................] - ETA: 1:29 - loss: 0.3379 - regression_loss: 0.3144 - classification_loss: 0.0235 240/500 [=============>................] - ETA: 1:29 - loss: 0.3378 - regression_loss: 0.3144 - classification_loss: 0.0234 241/500 [=============>................] - ETA: 1:29 - loss: 0.3375 - regression_loss: 0.3141 - classification_loss: 0.0234 242/500 [=============>................] - ETA: 1:28 - loss: 0.3375 - regression_loss: 0.3141 - classification_loss: 0.0234 243/500 [=============>................] - ETA: 1:28 - loss: 0.3369 - regression_loss: 0.3136 - classification_loss: 0.0233 244/500 [=============>................] - ETA: 1:28 - loss: 0.3371 - regression_loss: 0.3138 - classification_loss: 0.0233 245/500 [=============>................] - ETA: 1:27 - loss: 0.3373 - regression_loss: 0.3140 - classification_loss: 0.0233 246/500 [=============>................] - ETA: 1:27 - loss: 0.3371 - regression_loss: 0.3138 - classification_loss: 0.0233 247/500 [=============>................] - ETA: 1:27 - loss: 0.3381 - regression_loss: 0.3148 - classification_loss: 0.0233 248/500 [=============>................] - ETA: 1:26 - loss: 0.3374 - regression_loss: 0.3142 - classification_loss: 0.0232 249/500 [=============>................] 
- ETA: 1:26 - loss: 0.3383 - regression_loss: 0.3150 - classification_loss: 0.0233 250/500 [==============>...............] - ETA: 1:26 - loss: 0.3391 - regression_loss: 0.3156 - classification_loss: 0.0235 251/500 [==============>...............] - ETA: 1:25 - loss: 0.3385 - regression_loss: 0.3150 - classification_loss: 0.0235 252/500 [==============>...............] - ETA: 1:25 - loss: 0.3378 - regression_loss: 0.3144 - classification_loss: 0.0234 253/500 [==============>...............] - ETA: 1:25 - loss: 0.3373 - regression_loss: 0.3140 - classification_loss: 0.0233 254/500 [==============>...............] - ETA: 1:24 - loss: 0.3369 - regression_loss: 0.3135 - classification_loss: 0.0233 255/500 [==============>...............] - ETA: 1:24 - loss: 0.3381 - regression_loss: 0.3147 - classification_loss: 0.0234 256/500 [==============>...............] - ETA: 1:24 - loss: 0.3387 - regression_loss: 0.3153 - classification_loss: 0.0234 257/500 [==============>...............] - ETA: 1:23 - loss: 0.3391 - regression_loss: 0.3156 - classification_loss: 0.0234 258/500 [==============>...............] - ETA: 1:23 - loss: 0.3401 - regression_loss: 0.3167 - classification_loss: 0.0234 259/500 [==============>...............] - ETA: 1:22 - loss: 0.3409 - regression_loss: 0.3175 - classification_loss: 0.0235 260/500 [==============>...............] - ETA: 1:22 - loss: 0.3407 - regression_loss: 0.3173 - classification_loss: 0.0234 261/500 [==============>...............] - ETA: 1:22 - loss: 0.3409 - regression_loss: 0.3176 - classification_loss: 0.0233 262/500 [==============>...............] - ETA: 1:21 - loss: 0.3411 - regression_loss: 0.3178 - classification_loss: 0.0233 263/500 [==============>...............] - ETA: 1:21 - loss: 0.3427 - regression_loss: 0.3192 - classification_loss: 0.0235 264/500 [==============>...............] - ETA: 1:21 - loss: 0.3426 - regression_loss: 0.3191 - classification_loss: 0.0235 265/500 [==============>...............] 
- ETA: 1:20 - loss: 0.3427 - regression_loss: 0.3192 - classification_loss: 0.0235 266/500 [==============>...............] - ETA: 1:20 - loss: 0.3420 - regression_loss: 0.3186 - classification_loss: 0.0235 267/500 [===============>..............] - ETA: 1:20 - loss: 0.3418 - regression_loss: 0.3184 - classification_loss: 0.0234 268/500 [===============>..............] - ETA: 1:19 - loss: 0.3418 - regression_loss: 0.3184 - classification_loss: 0.0234 269/500 [===============>..............] - ETA: 1:19 - loss: 0.3422 - regression_loss: 0.3188 - classification_loss: 0.0234 270/500 [===============>..............] - ETA: 1:19 - loss: 0.3418 - regression_loss: 0.3185 - classification_loss: 0.0233 271/500 [===============>..............] - ETA: 1:18 - loss: 0.3413 - regression_loss: 0.3181 - classification_loss: 0.0233 272/500 [===============>..............] - ETA: 1:18 - loss: 0.3418 - regression_loss: 0.3185 - classification_loss: 0.0233 273/500 [===============>..............] - ETA: 1:18 - loss: 0.3415 - regression_loss: 0.3182 - classification_loss: 0.0233 274/500 [===============>..............] - ETA: 1:17 - loss: 0.3437 - regression_loss: 0.3201 - classification_loss: 0.0236 275/500 [===============>..............] - ETA: 1:17 - loss: 0.3436 - regression_loss: 0.3200 - classification_loss: 0.0236 276/500 [===============>..............] - ETA: 1:17 - loss: 0.3429 - regression_loss: 0.3194 - classification_loss: 0.0235 277/500 [===============>..............] - ETA: 1:16 - loss: 0.3426 - regression_loss: 0.3191 - classification_loss: 0.0235 278/500 [===============>..............] - ETA: 1:16 - loss: 0.3426 - regression_loss: 0.3191 - classification_loss: 0.0235 279/500 [===============>..............] - ETA: 1:15 - loss: 0.3422 - regression_loss: 0.3187 - classification_loss: 0.0235 280/500 [===============>..............] - ETA: 1:15 - loss: 0.3416 - regression_loss: 0.3182 - classification_loss: 0.0234 281/500 [===============>..............] 
- ETA: 1:15 - loss: 0.3423 - regression_loss: 0.3189 - classification_loss: 0.0235 282/500 [===============>..............] - ETA: 1:14 - loss: 0.3418 - regression_loss: 0.3184 - classification_loss: 0.0234 283/500 [===============>..............] - ETA: 1:14 - loss: 0.3419 - regression_loss: 0.3185 - classification_loss: 0.0234 284/500 [================>.............] - ETA: 1:14 - loss: 0.3420 - regression_loss: 0.3185 - classification_loss: 0.0234 285/500 [================>.............] - ETA: 1:13 - loss: 0.3418 - regression_loss: 0.3184 - classification_loss: 0.0234 286/500 [================>.............] - ETA: 1:13 - loss: 0.3417 - regression_loss: 0.3183 - classification_loss: 0.0234 287/500 [================>.............] - ETA: 1:13 - loss: 0.3413 - regression_loss: 0.3180 - classification_loss: 0.0233 288/500 [================>.............] - ETA: 1:12 - loss: 0.3405 - regression_loss: 0.3172 - classification_loss: 0.0233 289/500 [================>.............] - ETA: 1:12 - loss: 0.3399 - regression_loss: 0.3168 - classification_loss: 0.0232 290/500 [================>.............] - ETA: 1:12 - loss: 0.3398 - regression_loss: 0.3167 - classification_loss: 0.0231 291/500 [================>.............] - ETA: 1:11 - loss: 0.3399 - regression_loss: 0.3168 - classification_loss: 0.0231 292/500 [================>.............] - ETA: 1:11 - loss: 0.3403 - regression_loss: 0.3172 - classification_loss: 0.0231 293/500 [================>.............] - ETA: 1:11 - loss: 0.3400 - regression_loss: 0.3169 - classification_loss: 0.0231 294/500 [================>.............] - ETA: 1:10 - loss: 0.3406 - regression_loss: 0.3175 - classification_loss: 0.0231 295/500 [================>.............] - ETA: 1:10 - loss: 0.3403 - regression_loss: 0.3173 - classification_loss: 0.0230 296/500 [================>.............] - ETA: 1:10 - loss: 0.3400 - regression_loss: 0.3170 - classification_loss: 0.0230 297/500 [================>.............] 
- ETA: 1:09 - loss: 0.3403 - regression_loss: 0.3173 - classification_loss: 0.0230 298/500 [================>.............] - ETA: 1:09 - loss: 0.3404 - regression_loss: 0.3173 - classification_loss: 0.0231 299/500 [================>.............] - ETA: 1:09 - loss: 0.3413 - regression_loss: 0.3181 - classification_loss: 0.0232 300/500 [=================>............] - ETA: 1:08 - loss: 0.3409 - regression_loss: 0.3178 - classification_loss: 0.0232 301/500 [=================>............] - ETA: 1:08 - loss: 0.3409 - regression_loss: 0.3178 - classification_loss: 0.0231 302/500 [=================>............] - ETA: 1:08 - loss: 0.3402 - regression_loss: 0.3171 - classification_loss: 0.0231 303/500 [=================>............] - ETA: 1:07 - loss: 0.3397 - regression_loss: 0.3167 - classification_loss: 0.0231 304/500 [=================>............] - ETA: 1:07 - loss: 0.3412 - regression_loss: 0.3181 - classification_loss: 0.0231 305/500 [=================>............] - ETA: 1:07 - loss: 0.3415 - regression_loss: 0.3184 - classification_loss: 0.0231 306/500 [=================>............] - ETA: 1:06 - loss: 0.3425 - regression_loss: 0.3193 - classification_loss: 0.0232 307/500 [=================>............] - ETA: 1:06 - loss: 0.3429 - regression_loss: 0.3197 - classification_loss: 0.0232 308/500 [=================>............] - ETA: 1:06 - loss: 0.3427 - regression_loss: 0.3195 - classification_loss: 0.0232 309/500 [=================>............] - ETA: 1:05 - loss: 0.3431 - regression_loss: 0.3198 - classification_loss: 0.0233 310/500 [=================>............] - ETA: 1:05 - loss: 0.3426 - regression_loss: 0.3194 - classification_loss: 0.0232 311/500 [=================>............] - ETA: 1:05 - loss: 0.3431 - regression_loss: 0.3199 - classification_loss: 0.0232 312/500 [=================>............] - ETA: 1:04 - loss: 0.3431 - regression_loss: 0.3198 - classification_loss: 0.0232 313/500 [=================>............] 
- ETA: 1:04 - loss: 0.3436 - regression_loss: 0.3204 - classification_loss: 0.0232 314/500 [=================>............] - ETA: 1:03 - loss: 0.3430 - regression_loss: 0.3199 - classification_loss: 0.0232 315/500 [=================>............] - ETA: 1:03 - loss: 0.3426 - regression_loss: 0.3194 - classification_loss: 0.0231 316/500 [=================>............] - ETA: 1:03 - loss: 0.3423 - regression_loss: 0.3192 - classification_loss: 0.0231 317/500 [==================>...........] - ETA: 1:02 - loss: 0.3417 - regression_loss: 0.3187 - classification_loss: 0.0230 318/500 [==================>...........] - ETA: 1:02 - loss: 0.3411 - regression_loss: 0.3181 - classification_loss: 0.0230 319/500 [==================>...........] - ETA: 1:02 - loss: 0.3417 - regression_loss: 0.3186 - classification_loss: 0.0231 320/500 [==================>...........] - ETA: 1:01 - loss: 0.3420 - regression_loss: 0.3189 - classification_loss: 0.0231 321/500 [==================>...........] - ETA: 1:01 - loss: 0.3422 - regression_loss: 0.3191 - classification_loss: 0.0231 322/500 [==================>...........] - ETA: 1:01 - loss: 0.3420 - regression_loss: 0.3190 - classification_loss: 0.0230 323/500 [==================>...........] - ETA: 1:00 - loss: 0.3419 - regression_loss: 0.3189 - classification_loss: 0.0230 324/500 [==================>...........] - ETA: 1:00 - loss: 0.3420 - regression_loss: 0.3189 - classification_loss: 0.0230 325/500 [==================>...........] - ETA: 1:00 - loss: 0.3425 - regression_loss: 0.3195 - classification_loss: 0.0230 326/500 [==================>...........] - ETA: 59s - loss: 0.3423 - regression_loss: 0.3193 - classification_loss: 0.0230  327/500 [==================>...........] - ETA: 59s - loss: 0.3419 - regression_loss: 0.3189 - classification_loss: 0.0230 328/500 [==================>...........] - ETA: 59s - loss: 0.3422 - regression_loss: 0.3192 - classification_loss: 0.0230 329/500 [==================>...........] 
- ETA: 58s - loss: 0.3429 - regression_loss: 0.3199 - classification_loss: 0.0231 330/500 [==================>...........] - ETA: 58s - loss: 0.3431 - regression_loss: 0.3201 - classification_loss: 0.0231 331/500 [==================>...........] - ETA: 58s - loss: 0.3428 - regression_loss: 0.3198 - classification_loss: 0.0230 332/500 [==================>...........] - ETA: 57s - loss: 0.3429 - regression_loss: 0.3198 - classification_loss: 0.0230 333/500 [==================>...........] - ETA: 57s - loss: 0.3433 - regression_loss: 0.3202 - classification_loss: 0.0231 334/500 [===================>..........] - ETA: 57s - loss: 0.3432 - regression_loss: 0.3201 - classification_loss: 0.0231 335/500 [===================>..........] - ETA: 56s - loss: 0.3428 - regression_loss: 0.3198 - classification_loss: 0.0230 336/500 [===================>..........] - ETA: 56s - loss: 0.3430 - regression_loss: 0.3200 - classification_loss: 0.0230 337/500 [===================>..........] - ETA: 56s - loss: 0.3427 - regression_loss: 0.3197 - classification_loss: 0.0230 338/500 [===================>..........] - ETA: 55s - loss: 0.3422 - regression_loss: 0.3193 - classification_loss: 0.0229 339/500 [===================>..........] - ETA: 55s - loss: 0.3423 - regression_loss: 0.3193 - classification_loss: 0.0230 340/500 [===================>..........] - ETA: 54s - loss: 0.3422 - regression_loss: 0.3193 - classification_loss: 0.0229 341/500 [===================>..........] - ETA: 54s - loss: 0.3425 - regression_loss: 0.3195 - classification_loss: 0.0230 342/500 [===================>..........] - ETA: 54s - loss: 0.3425 - regression_loss: 0.3196 - classification_loss: 0.0229 343/500 [===================>..........] - ETA: 53s - loss: 0.3427 - regression_loss: 0.3198 - classification_loss: 0.0229 344/500 [===================>..........] - ETA: 53s - loss: 0.3423 - regression_loss: 0.3194 - classification_loss: 0.0229 345/500 [===================>..........] 
- ETA: 53s - loss: 0.3424 - regression_loss: 0.3194 - classification_loss: 0.0230 346/500 [===================>..........] - ETA: 52s - loss: 0.3425 - regression_loss: 0.3196 - classification_loss: 0.0229 347/500 [===================>..........] - ETA: 52s - loss: 0.3422 - regression_loss: 0.3193 - classification_loss: 0.0229 348/500 [===================>..........] - ETA: 52s - loss: 0.3421 - regression_loss: 0.3193 - classification_loss: 0.0229 349/500 [===================>..........] - ETA: 51s - loss: 0.3425 - regression_loss: 0.3196 - classification_loss: 0.0228 350/500 [====================>.........] - ETA: 51s - loss: 0.3423 - regression_loss: 0.3195 - classification_loss: 0.0228 351/500 [====================>.........] - ETA: 51s - loss: 0.3419 - regression_loss: 0.3191 - classification_loss: 0.0228 352/500 [====================>.........] - ETA: 50s - loss: 0.3415 - regression_loss: 0.3188 - classification_loss: 0.0227 353/500 [====================>.........] - ETA: 50s - loss: 0.3412 - regression_loss: 0.3185 - classification_loss: 0.0227 354/500 [====================>.........] - ETA: 50s - loss: 0.3409 - regression_loss: 0.3182 - classification_loss: 0.0227 355/500 [====================>.........] - ETA: 49s - loss: 0.3406 - regression_loss: 0.3180 - classification_loss: 0.0226 356/500 [====================>.........] - ETA: 49s - loss: 0.3405 - regression_loss: 0.3179 - classification_loss: 0.0226 357/500 [====================>.........] - ETA: 49s - loss: 0.3410 - regression_loss: 0.3185 - classification_loss: 0.0226 358/500 [====================>.........] - ETA: 48s - loss: 0.3407 - regression_loss: 0.3181 - classification_loss: 0.0226 359/500 [====================>.........] - ETA: 48s - loss: 0.3407 - regression_loss: 0.3181 - classification_loss: 0.0226 360/500 [====================>.........] - ETA: 48s - loss: 0.3403 - regression_loss: 0.3177 - classification_loss: 0.0226 361/500 [====================>.........] 
- ETA: 47s - loss: 0.3399 - regression_loss: 0.3173 - classification_loss: 0.0225 362/500 [====================>.........] - ETA: 47s - loss: 0.3394 - regression_loss: 0.3169 - classification_loss: 0.0225 363/500 [====================>.........] - ETA: 47s - loss: 0.3395 - regression_loss: 0.3170 - classification_loss: 0.0225 364/500 [====================>.........] - ETA: 46s - loss: 0.3392 - regression_loss: 0.3168 - classification_loss: 0.0224 365/500 [====================>.........] - ETA: 46s - loss: 0.3391 - regression_loss: 0.3167 - classification_loss: 0.0225 366/500 [====================>.........] - ETA: 45s - loss: 0.3396 - regression_loss: 0.3171 - classification_loss: 0.0225 367/500 [=====================>........] - ETA: 45s - loss: 0.3392 - regression_loss: 0.3168 - classification_loss: 0.0224 368/500 [=====================>........] - ETA: 45s - loss: 0.3398 - regression_loss: 0.3173 - classification_loss: 0.0225 369/500 [=====================>........] - ETA: 44s - loss: 0.3395 - regression_loss: 0.3171 - classification_loss: 0.0224 370/500 [=====================>........] - ETA: 44s - loss: 0.3393 - regression_loss: 0.3169 - classification_loss: 0.0224 371/500 [=====================>........] - ETA: 44s - loss: 0.3392 - regression_loss: 0.3168 - classification_loss: 0.0224 372/500 [=====================>........] - ETA: 43s - loss: 0.3392 - regression_loss: 0.3168 - classification_loss: 0.0224 373/500 [=====================>........] - ETA: 43s - loss: 0.3390 - regression_loss: 0.3166 - classification_loss: 0.0224 374/500 [=====================>........] - ETA: 43s - loss: 0.3387 - regression_loss: 0.3164 - classification_loss: 0.0224 375/500 [=====================>........] - ETA: 42s - loss: 0.3384 - regression_loss: 0.3161 - classification_loss: 0.0223 376/500 [=====================>........] - ETA: 42s - loss: 0.3383 - regression_loss: 0.3160 - classification_loss: 0.0223 377/500 [=====================>........] 
- ETA: 42s - loss: 0.3379 - regression_loss: 0.3156 - classification_loss: 0.0223 378/500 [=====================>........] - ETA: 41s - loss: 0.3379 - regression_loss: 0.3156 - classification_loss: 0.0223 379/500 [=====================>........] - ETA: 41s - loss: 0.3380 - regression_loss: 0.3156 - classification_loss: 0.0223 380/500 [=====================>........] - ETA: 41s - loss: 0.3380 - regression_loss: 0.3157 - classification_loss: 0.0224 381/500 [=====================>........] - ETA: 40s - loss: 0.3379 - regression_loss: 0.3156 - classification_loss: 0.0223 382/500 [=====================>........] - ETA: 40s - loss: 0.3378 - regression_loss: 0.3156 - classification_loss: 0.0223 383/500 [=====================>........] - ETA: 40s - loss: 0.3378 - regression_loss: 0.3156 - classification_loss: 0.0222 384/500 [======================>.......] - ETA: 39s - loss: 0.3374 - regression_loss: 0.3152 - classification_loss: 0.0222 385/500 [======================>.......] - ETA: 39s - loss: 0.3376 - regression_loss: 0.3153 - classification_loss: 0.0223 386/500 [======================>.......] - ETA: 39s - loss: 0.3372 - regression_loss: 0.3150 - classification_loss: 0.0223 387/500 [======================>.......] - ETA: 38s - loss: 0.3374 - regression_loss: 0.3152 - classification_loss: 0.0223 388/500 [======================>.......] - ETA: 38s - loss: 0.3372 - regression_loss: 0.3150 - classification_loss: 0.0222 389/500 [======================>.......] - ETA: 38s - loss: 0.3376 - regression_loss: 0.3153 - classification_loss: 0.0223 390/500 [======================>.......] - ETA: 37s - loss: 0.3390 - regression_loss: 0.3165 - classification_loss: 0.0225 391/500 [======================>.......] - ETA: 37s - loss: 0.3386 - regression_loss: 0.3161 - classification_loss: 0.0225 392/500 [======================>.......] - ETA: 37s - loss: 0.3385 - regression_loss: 0.3160 - classification_loss: 0.0224 393/500 [======================>.......] 
- ETA: 36s - loss: 0.3382 - regression_loss: 0.3158 - classification_loss: 0.0224 394/500 [======================>.......] - ETA: 36s - loss: 0.3386 - regression_loss: 0.3161 - classification_loss: 0.0224 395/500 [======================>.......] - ETA: 36s - loss: 0.3385 - regression_loss: 0.3161 - classification_loss: 0.0224 396/500 [======================>.......] - ETA: 35s - loss: 0.3383 - regression_loss: 0.3159 - classification_loss: 0.0224 397/500 [======================>.......] - ETA: 35s - loss: 0.3382 - regression_loss: 0.3158 - classification_loss: 0.0224 398/500 [======================>.......] - ETA: 35s - loss: 0.3383 - regression_loss: 0.3159 - classification_loss: 0.0224 399/500 [======================>.......] - ETA: 34s - loss: 0.3385 - regression_loss: 0.3161 - classification_loss: 0.0224 400/500 [=======================>......] - ETA: 34s - loss: 0.3391 - regression_loss: 0.3165 - classification_loss: 0.0225 401/500 [=======================>......] - ETA: 33s - loss: 0.3389 - regression_loss: 0.3164 - classification_loss: 0.0225 402/500 [=======================>......] - ETA: 33s - loss: 0.3389 - regression_loss: 0.3164 - classification_loss: 0.0225 403/500 [=======================>......] - ETA: 33s - loss: 0.3387 - regression_loss: 0.3163 - classification_loss: 0.0225 404/500 [=======================>......] - ETA: 32s - loss: 0.3386 - regression_loss: 0.3161 - classification_loss: 0.0225 405/500 [=======================>......] - ETA: 32s - loss: 0.3388 - regression_loss: 0.3164 - classification_loss: 0.0225 406/500 [=======================>......] - ETA: 32s - loss: 0.3389 - regression_loss: 0.3164 - classification_loss: 0.0225 407/500 [=======================>......] - ETA: 31s - loss: 0.3386 - regression_loss: 0.3161 - classification_loss: 0.0225 408/500 [=======================>......] - ETA: 31s - loss: 0.3383 - regression_loss: 0.3159 - classification_loss: 0.0225 409/500 [=======================>......] 
- ETA: 31s - loss: 0.3385 - regression_loss: 0.3161 - classification_loss: 0.0225 410/500 [=======================>......] - ETA: 30s - loss: 0.3385 - regression_loss: 0.3160 - classification_loss: 0.0225 411/500 [=======================>......] - ETA: 30s - loss: 0.3385 - regression_loss: 0.3161 - classification_loss: 0.0225 412/500 [=======================>......] - ETA: 30s - loss: 0.3381 - regression_loss: 0.3157 - classification_loss: 0.0224 413/500 [=======================>......] - ETA: 29s - loss: 0.3389 - regression_loss: 0.3165 - classification_loss: 0.0224 414/500 [=======================>......] - ETA: 29s - loss: 0.3385 - regression_loss: 0.3161 - classification_loss: 0.0224 415/500 [=======================>......] - ETA: 29s - loss: 0.3389 - regression_loss: 0.3165 - classification_loss: 0.0224 416/500 [=======================>......] - ETA: 28s - loss: 0.3395 - regression_loss: 0.3171 - classification_loss: 0.0224 417/500 [========================>.....] - ETA: 28s - loss: 0.3403 - regression_loss: 0.3179 - classification_loss: 0.0225 418/500 [========================>.....] - ETA: 28s - loss: 0.3410 - regression_loss: 0.3185 - classification_loss: 0.0225 419/500 [========================>.....] - ETA: 27s - loss: 0.3406 - regression_loss: 0.3182 - classification_loss: 0.0225 420/500 [========================>.....] - ETA: 27s - loss: 0.3404 - regression_loss: 0.3180 - classification_loss: 0.0224 421/500 [========================>.....] - ETA: 27s - loss: 0.3403 - regression_loss: 0.3178 - classification_loss: 0.0225 422/500 [========================>.....] - ETA: 26s - loss: 0.3401 - regression_loss: 0.3177 - classification_loss: 0.0224 423/500 [========================>.....] - ETA: 26s - loss: 0.3404 - regression_loss: 0.3180 - classification_loss: 0.0224 424/500 [========================>.....] - ETA: 26s - loss: 0.3401 - regression_loss: 0.3177 - classification_loss: 0.0224 425/500 [========================>.....] 
- ETA: 25s - loss: 0.3398 - regression_loss: 0.3174 - classification_loss: 0.0223 426/500 [========================>.....] - ETA: 25s - loss: 0.3401 - regression_loss: 0.3178 - classification_loss: 0.0224 427/500 [========================>.....] - ETA: 25s - loss: 0.3399 - regression_loss: 0.3176 - classification_loss: 0.0223 428/500 [========================>.....] - ETA: 24s - loss: 0.3400 - regression_loss: 0.3176 - classification_loss: 0.0224 429/500 [========================>.....] - ETA: 24s - loss: 0.3401 - regression_loss: 0.3177 - classification_loss: 0.0224 430/500 [========================>.....] - ETA: 24s - loss: 0.3406 - regression_loss: 0.3181 - classification_loss: 0.0226 431/500 [========================>.....] - ETA: 23s - loss: 0.3401 - regression_loss: 0.3176 - classification_loss: 0.0225 432/500 [========================>.....] - ETA: 23s - loss: 0.3400 - regression_loss: 0.3175 - classification_loss: 0.0225 433/500 [========================>.....] - ETA: 23s - loss: 0.3401 - regression_loss: 0.3176 - classification_loss: 0.0225 434/500 [=========================>....] - ETA: 22s - loss: 0.3407 - regression_loss: 0.3182 - classification_loss: 0.0226 435/500 [=========================>....] - ETA: 22s - loss: 0.3406 - regression_loss: 0.3181 - classification_loss: 0.0225 436/500 [=========================>....] - ETA: 21s - loss: 0.3408 - regression_loss: 0.3183 - classification_loss: 0.0225 437/500 [=========================>....] - ETA: 21s - loss: 0.3422 - regression_loss: 0.3196 - classification_loss: 0.0226 438/500 [=========================>....] - ETA: 21s - loss: 0.3421 - regression_loss: 0.3195 - classification_loss: 0.0226 439/500 [=========================>....] - ETA: 20s - loss: 0.3421 - regression_loss: 0.3195 - classification_loss: 0.0226 440/500 [=========================>....] - ETA: 20s - loss: 0.3421 - regression_loss: 0.3196 - classification_loss: 0.0226 441/500 [=========================>....] 
- ETA: 20s - loss: 0.3418 - regression_loss: 0.3193 - classification_loss: 0.0225 442/500 [=========================>....] - ETA: 19s - loss: 0.3419 - regression_loss: 0.3194 - classification_loss: 0.0225 443/500 [=========================>....] - ETA: 19s - loss: 0.3430 - regression_loss: 0.3205 - classification_loss: 0.0226 444/500 [=========================>....] - ETA: 19s - loss: 0.3438 - regression_loss: 0.3212 - classification_loss: 0.0226 445/500 [=========================>....] - ETA: 18s - loss: 0.3442 - regression_loss: 0.3216 - classification_loss: 0.0226 446/500 [=========================>....] - ETA: 18s - loss: 0.3444 - regression_loss: 0.3218 - classification_loss: 0.0226 447/500 [=========================>....] - ETA: 18s - loss: 0.3441 - regression_loss: 0.3215 - classification_loss: 0.0226 448/500 [=========================>....] - ETA: 17s - loss: 0.3441 - regression_loss: 0.3215 - classification_loss: 0.0226 449/500 [=========================>....] - ETA: 17s - loss: 0.3438 - regression_loss: 0.3212 - classification_loss: 0.0226 450/500 [==========================>...] - ETA: 17s - loss: 0.3438 - regression_loss: 0.3212 - classification_loss: 0.0226 451/500 [==========================>...] - ETA: 16s - loss: 0.3435 - regression_loss: 0.3209 - classification_loss: 0.0226 452/500 [==========================>...] - ETA: 16s - loss: 0.3431 - regression_loss: 0.3206 - classification_loss: 0.0225 453/500 [==========================>...] - ETA: 16s - loss: 0.3444 - regression_loss: 0.3218 - classification_loss: 0.0226 454/500 [==========================>...] - ETA: 15s - loss: 0.3446 - regression_loss: 0.3220 - classification_loss: 0.0226 455/500 [==========================>...] - ETA: 15s - loss: 0.3447 - regression_loss: 0.3221 - classification_loss: 0.0226 456/500 [==========================>...] - ETA: 15s - loss: 0.3447 - regression_loss: 0.3221 - classification_loss: 0.0226 457/500 [==========================>...] 
- ETA: 14s - loss: 0.3448 - regression_loss: 0.3222 - classification_loss: 0.0226 458/500 [==========================>...] - ETA: 14s - loss: 0.3444 - regression_loss: 0.3219 - classification_loss: 0.0226 459/500 [==========================>...] - ETA: 14s - loss: 0.3446 - regression_loss: 0.3221 - classification_loss: 0.0226 460/500 [==========================>...] - ETA: 13s - loss: 0.3445 - regression_loss: 0.3219 - classification_loss: 0.0226 461/500 [==========================>...] - ETA: 13s - loss: 0.3447 - regression_loss: 0.3221 - classification_loss: 0.0226 462/500 [==========================>...] - ETA: 13s - loss: 0.3447 - regression_loss: 0.3221 - classification_loss: 0.0226 463/500 [==========================>...] - ETA: 12s - loss: 0.3448 - regression_loss: 0.3222 - classification_loss: 0.0226 464/500 [==========================>...] - ETA: 12s - loss: 0.3447 - regression_loss: 0.3221 - classification_loss: 0.0226 465/500 [==========================>...] - ETA: 12s - loss: 0.3447 - regression_loss: 0.3221 - classification_loss: 0.0226 466/500 [==========================>...] - ETA: 11s - loss: 0.3446 - regression_loss: 0.3220 - classification_loss: 0.0225 467/500 [===========================>..] - ETA: 11s - loss: 0.3450 - regression_loss: 0.3225 - classification_loss: 0.0225 468/500 [===========================>..] - ETA: 10s - loss: 0.3448 - regression_loss: 0.3223 - classification_loss: 0.0225 469/500 [===========================>..] - ETA: 10s - loss: 0.3445 - regression_loss: 0.3220 - classification_loss: 0.0225 470/500 [===========================>..] - ETA: 10s - loss: 0.3451 - regression_loss: 0.3225 - classification_loss: 0.0225 471/500 [===========================>..] - ETA: 9s - loss: 0.3447 - regression_loss: 0.3222 - classification_loss: 0.0225  472/500 [===========================>..] - ETA: 9s - loss: 0.3444 - regression_loss: 0.3219 - classification_loss: 0.0224 473/500 [===========================>..] 
- ETA: 9s - loss: 0.3442 - regression_loss: 0.3217 - classification_loss: 0.0224 474/500 [===========================>..] - ETA: 8s - loss: 0.3438 - regression_loss: 0.3214 - classification_loss: 0.0224 475/500 [===========================>..] - ETA: 8s - loss: 0.3435 - regression_loss: 0.3211 - classification_loss: 0.0224 476/500 [===========================>..] - ETA: 8s - loss: 0.3431 - regression_loss: 0.3208 - classification_loss: 0.0223 477/500 [===========================>..] - ETA: 7s - loss: 0.3435 - regression_loss: 0.3212 - classification_loss: 0.0223 478/500 [===========================>..] - ETA: 7s - loss: 0.3434 - regression_loss: 0.3211 - classification_loss: 0.0223 479/500 [===========================>..] - ETA: 7s - loss: 0.3435 - regression_loss: 0.3212 - classification_loss: 0.0223 480/500 [===========================>..] - ETA: 6s - loss: 0.3431 - regression_loss: 0.3208 - classification_loss: 0.0223 481/500 [===========================>..] - ETA: 6s - loss: 0.3429 - regression_loss: 0.3206 - classification_loss: 0.0223 482/500 [===========================>..] - ETA: 6s - loss: 0.3430 - regression_loss: 0.3207 - classification_loss: 0.0223 483/500 [===========================>..] - ETA: 5s - loss: 0.3429 - regression_loss: 0.3207 - classification_loss: 0.0222 484/500 [============================>.] - ETA: 5s - loss: 0.3429 - regression_loss: 0.3207 - classification_loss: 0.0222 485/500 [============================>.] - ETA: 5s - loss: 0.3430 - regression_loss: 0.3208 - classification_loss: 0.0222 486/500 [============================>.] - ETA: 4s - loss: 0.3431 - regression_loss: 0.3209 - classification_loss: 0.0223 487/500 [============================>.] - ETA: 4s - loss: 0.3436 - regression_loss: 0.3213 - classification_loss: 0.0223 488/500 [============================>.] - ETA: 4s - loss: 0.3436 - regression_loss: 0.3213 - classification_loss: 0.0223 489/500 [============================>.] 
- ETA: 3s - loss: 0.3438 - regression_loss: 0.3215 - classification_loss: 0.0223 490/500 [============================>.] - ETA: 3s - loss: 0.3438 - regression_loss: 0.3215 - classification_loss: 0.0223 491/500 [============================>.] - ETA: 3s - loss: 0.3441 - regression_loss: 0.3218 - classification_loss: 0.0223 492/500 [============================>.] - ETA: 2s - loss: 0.3439 - regression_loss: 0.3216 - classification_loss: 0.0223 493/500 [============================>.] - ETA: 2s - loss: 0.3435 - regression_loss: 0.3212 - classification_loss: 0.0223 494/500 [============================>.] - ETA: 2s - loss: 0.3435 - regression_loss: 0.3212 - classification_loss: 0.0223 495/500 [============================>.] - ETA: 1s - loss: 0.3433 - regression_loss: 0.3210 - classification_loss: 0.0223 496/500 [============================>.] - ETA: 1s - loss: 0.3436 - regression_loss: 0.3213 - classification_loss: 0.0223 497/500 [============================>.] - ETA: 1s - loss: 0.3438 - regression_loss: 0.3215 - classification_loss: 0.0223 498/500 [============================>.] - ETA: 0s - loss: 0.3444 - regression_loss: 0.3220 - classification_loss: 0.0224 499/500 [============================>.] - ETA: 0s - loss: 0.3446 - regression_loss: 0.3222 - classification_loss: 0.0224 500/500 [==============================] - 172s 343ms/step - loss: 0.3444 - regression_loss: 0.3220 - classification_loss: 0.0224
1172 instances of class plum with average precision: 0.7470
mAP: 0.7470
Epoch 00032: saving model to ./training/snapshots/resnet101_pascal_32.h5
Epoch 33/150
1/500 [..............................] - ETA: 2:33 - loss: 0.2986 - regression_loss: 0.2814 - classification_loss: 0.0172 2/500 [..............................] - ETA: 2:33 - loss: 0.2692 - regression_loss: 0.2542 - classification_loss: 0.0151 3/500 [..............................] - ETA: 2:34 - loss: 0.2775 - regression_loss: 0.2608 - classification_loss: 0.0167 4/500 [..............................] 
- ETA: 2:38 - loss: 0.2959 - regression_loss: 0.2785 - classification_loss: 0.0174 5/500 [..............................] - ETA: 2:42 - loss: 0.3053 - regression_loss: 0.2869 - classification_loss: 0.0184 6/500 [..............................] - ETA: 2:43 - loss: 0.2778 - regression_loss: 0.2612 - classification_loss: 0.0166 7/500 [..............................] - ETA: 2:42 - loss: 0.2734 - regression_loss: 0.2580 - classification_loss: 0.0153 8/500 [..............................] - ETA: 2:43 - loss: 0.2721 - regression_loss: 0.2581 - classification_loss: 0.0141 9/500 [..............................] - ETA: 2:42 - loss: 0.2878 - regression_loss: 0.2722 - classification_loss: 0.0156 10/500 [..............................] - ETA: 2:41 - loss: 0.2923 - regression_loss: 0.2762 - classification_loss: 0.0161 11/500 [..............................] - ETA: 2:42 - loss: 0.2987 - regression_loss: 0.2825 - classification_loss: 0.0162 12/500 [..............................] - ETA: 2:41 - loss: 0.3045 - regression_loss: 0.2870 - classification_loss: 0.0175 13/500 [..............................] - ETA: 2:41 - loss: 0.3089 - regression_loss: 0.2905 - classification_loss: 0.0184 14/500 [..............................] - ETA: 2:41 - loss: 0.2980 - regression_loss: 0.2802 - classification_loss: 0.0178 15/500 [..............................] - ETA: 2:41 - loss: 0.2926 - regression_loss: 0.2752 - classification_loss: 0.0174 16/500 [..............................] - ETA: 2:40 - loss: 0.2843 - regression_loss: 0.2674 - classification_loss: 0.0169 17/500 [>.............................] - ETA: 2:39 - loss: 0.3129 - regression_loss: 0.2933 - classification_loss: 0.0196 18/500 [>.............................] - ETA: 2:38 - loss: 0.3029 - regression_loss: 0.2842 - classification_loss: 0.0187 19/500 [>.............................] - ETA: 2:38 - loss: 0.3036 - regression_loss: 0.2838 - classification_loss: 0.0199 20/500 [>.............................] 
- ETA: 2:38 - loss: 0.3095 - regression_loss: 0.2900 - classification_loss: 0.0194 21/500 [>.............................] - ETA: 2:38 - loss: 0.3299 - regression_loss: 0.3095 - classification_loss: 0.0204 22/500 [>.............................] - ETA: 2:38 - loss: 0.3246 - regression_loss: 0.3045 - classification_loss: 0.0201 23/500 [>.............................] - ETA: 2:38 - loss: 0.3354 - regression_loss: 0.3137 - classification_loss: 0.0217 24/500 [>.............................] - ETA: 2:38 - loss: 0.3368 - regression_loss: 0.3151 - classification_loss: 0.0217 25/500 [>.............................] - ETA: 2:38 - loss: 0.3466 - regression_loss: 0.3245 - classification_loss: 0.0220 26/500 [>.............................] - ETA: 2:37 - loss: 0.3524 - regression_loss: 0.3302 - classification_loss: 0.0221 27/500 [>.............................] - ETA: 2:37 - loss: 0.3563 - regression_loss: 0.3343 - classification_loss: 0.0220 28/500 [>.............................] - ETA: 2:37 - loss: 0.3566 - regression_loss: 0.3351 - classification_loss: 0.0215 29/500 [>.............................] - ETA: 2:37 - loss: 0.3560 - regression_loss: 0.3348 - classification_loss: 0.0212 30/500 [>.............................] - ETA: 2:37 - loss: 0.3610 - regression_loss: 0.3387 - classification_loss: 0.0223 31/500 [>.............................] - ETA: 2:36 - loss: 0.3591 - regression_loss: 0.3371 - classification_loss: 0.0220 32/500 [>.............................] - ETA: 2:36 - loss: 0.3627 - regression_loss: 0.3409 - classification_loss: 0.0218 33/500 [>.............................] - ETA: 2:36 - loss: 0.3682 - regression_loss: 0.3453 - classification_loss: 0.0229 34/500 [=>............................] - ETA: 2:36 - loss: 0.3735 - regression_loss: 0.3500 - classification_loss: 0.0236 35/500 [=>............................] - ETA: 2:36 - loss: 0.3772 - regression_loss: 0.3536 - classification_loss: 0.0236 36/500 [=>............................] 
- ETA: 2:35 - loss: 0.3788 - regression_loss: 0.3551 - classification_loss: 0.0237 37/500 [=>............................] - ETA: 2:35 - loss: 0.3796 - regression_loss: 0.3552 - classification_loss: 0.0243 38/500 [=>............................] - ETA: 2:35 - loss: 0.3734 - regression_loss: 0.3494 - classification_loss: 0.0240 39/500 [=>............................] - ETA: 2:35 - loss: 0.3772 - regression_loss: 0.3532 - classification_loss: 0.0240 40/500 [=>............................] - ETA: 2:34 - loss: 0.3796 - regression_loss: 0.3560 - classification_loss: 0.0236 41/500 [=>............................] - ETA: 2:34 - loss: 0.3745 - regression_loss: 0.3512 - classification_loss: 0.0232 42/500 [=>............................] - ETA: 2:34 - loss: 0.3772 - regression_loss: 0.3538 - classification_loss: 0.0234 43/500 [=>............................] - ETA: 2:33 - loss: 0.3775 - regression_loss: 0.3542 - classification_loss: 0.0233 44/500 [=>............................] - ETA: 2:33 - loss: 0.3763 - regression_loss: 0.3530 - classification_loss: 0.0232 45/500 [=>............................] - ETA: 2:33 - loss: 0.3800 - regression_loss: 0.3564 - classification_loss: 0.0236 46/500 [=>............................] - ETA: 2:33 - loss: 0.3791 - regression_loss: 0.3557 - classification_loss: 0.0234 47/500 [=>............................] - ETA: 2:33 - loss: 0.3805 - regression_loss: 0.3573 - classification_loss: 0.0232 48/500 [=>............................] - ETA: 2:32 - loss: 0.3803 - regression_loss: 0.3570 - classification_loss: 0.0233 49/500 [=>............................] - ETA: 2:32 - loss: 0.3785 - regression_loss: 0.3552 - classification_loss: 0.0233 50/500 [==>...........................] - ETA: 2:32 - loss: 0.3800 - regression_loss: 0.3567 - classification_loss: 0.0233 51/500 [==>...........................] - ETA: 2:31 - loss: 0.3789 - regression_loss: 0.3557 - classification_loss: 0.0232 52/500 [==>...........................] 
- ETA: 2:31 - loss: 0.3772 - regression_loss: 0.3541 - classification_loss: 0.0231 53/500 [==>...........................] - ETA: 2:31 - loss: 0.3780 - regression_loss: 0.3548 - classification_loss: 0.0232 54/500 [==>...........................] - ETA: 2:30 - loss: 0.3783 - regression_loss: 0.3550 - classification_loss: 0.0233 55/500 [==>...........................] - ETA: 2:30 - loss: 0.3781 - regression_loss: 0.3549 - classification_loss: 0.0233 56/500 [==>...........................] - ETA: 2:30 - loss: 0.3753 - regression_loss: 0.3522 - classification_loss: 0.0231 57/500 [==>...........................] - ETA: 2:29 - loss: 0.3720 - regression_loss: 0.3490 - classification_loss: 0.0230 58/500 [==>...........................] - ETA: 2:29 - loss: 0.3747 - regression_loss: 0.3514 - classification_loss: 0.0233 59/500 [==>...........................] - ETA: 2:29 - loss: 0.3772 - regression_loss: 0.3534 - classification_loss: 0.0238 60/500 [==>...........................] - ETA: 2:28 - loss: 0.3762 - regression_loss: 0.3526 - classification_loss: 0.0236 61/500 [==>...........................] - ETA: 2:28 - loss: 0.3754 - regression_loss: 0.3519 - classification_loss: 0.0234 62/500 [==>...........................] - ETA: 2:28 - loss: 0.3799 - regression_loss: 0.3562 - classification_loss: 0.0237 63/500 [==>...........................] - ETA: 2:27 - loss: 0.3777 - regression_loss: 0.3541 - classification_loss: 0.0236 64/500 [==>...........................] - ETA: 2:27 - loss: 0.3791 - regression_loss: 0.3555 - classification_loss: 0.0236 65/500 [==>...........................] - ETA: 2:27 - loss: 0.3761 - regression_loss: 0.3528 - classification_loss: 0.0233 66/500 [==>...........................] - ETA: 2:26 - loss: 0.3771 - regression_loss: 0.3536 - classification_loss: 0.0235 67/500 [===>..........................] - ETA: 2:26 - loss: 0.3767 - regression_loss: 0.3533 - classification_loss: 0.0234 68/500 [===>..........................] 
- ETA: 2:26 - loss: 0.3731 - regression_loss: 0.3500 - classification_loss: 0.0231 69/500 [===>..........................] - ETA: 2:26 - loss: 0.3724 - regression_loss: 0.3489 - classification_loss: 0.0235 70/500 [===>..........................] - ETA: 2:25 - loss: 0.3753 - regression_loss: 0.3517 - classification_loss: 0.0236 71/500 [===>..........................] - ETA: 2:25 - loss: 0.3723 - regression_loss: 0.3489 - classification_loss: 0.0234 72/500 [===>..........................] - ETA: 2:25 - loss: 0.3732 - regression_loss: 0.3500 - classification_loss: 0.0232 73/500 [===>..........................] - ETA: 2:24 - loss: 0.3755 - regression_loss: 0.3520 - classification_loss: 0.0235 74/500 [===>..........................] - ETA: 2:24 - loss: 0.3761 - regression_loss: 0.3525 - classification_loss: 0.0236 75/500 [===>..........................] - ETA: 2:24 - loss: 0.3764 - regression_loss: 0.3525 - classification_loss: 0.0239 76/500 [===>..........................] - ETA: 2:24 - loss: 0.3775 - regression_loss: 0.3534 - classification_loss: 0.0241 77/500 [===>..........................] - ETA: 2:23 - loss: 0.3790 - regression_loss: 0.3547 - classification_loss: 0.0243 78/500 [===>..........................] - ETA: 2:23 - loss: 0.3790 - regression_loss: 0.3548 - classification_loss: 0.0242 79/500 [===>..........................] - ETA: 2:23 - loss: 0.3785 - regression_loss: 0.3543 - classification_loss: 0.0242 80/500 [===>..........................] - ETA: 2:22 - loss: 0.3794 - regression_loss: 0.3552 - classification_loss: 0.0242 81/500 [===>..........................] - ETA: 2:22 - loss: 0.3806 - regression_loss: 0.3563 - classification_loss: 0.0243 82/500 [===>..........................] - ETA: 2:22 - loss: 0.3804 - regression_loss: 0.3562 - classification_loss: 0.0242 83/500 [===>..........................] - ETA: 2:21 - loss: 0.3795 - regression_loss: 0.3555 - classification_loss: 0.0240 84/500 [====>.........................] 
- ETA: 2:21 - loss: 0.3781 - regression_loss: 0.3543 - classification_loss: 0.0238 85/500 [====>.........................] - ETA: 2:21 - loss: 0.3755 - regression_loss: 0.3518 - classification_loss: 0.0237 86/500 [====>.........................] - ETA: 2:20 - loss: 0.3740 - regression_loss: 0.3504 - classification_loss: 0.0235 87/500 [====>.........................] - ETA: 2:20 - loss: 0.3754 - regression_loss: 0.3516 - classification_loss: 0.0238 88/500 [====>.........................] - ETA: 2:20 - loss: 0.3731 - regression_loss: 0.3496 - classification_loss: 0.0235 89/500 [====>.........................] - ETA: 2:19 - loss: 0.3729 - regression_loss: 0.3495 - classification_loss: 0.0234 90/500 [====>.........................] - ETA: 2:19 - loss: 0.3723 - regression_loss: 0.3490 - classification_loss: 0.0234 91/500 [====>.........................] - ETA: 2:19 - loss: 0.3712 - regression_loss: 0.3479 - classification_loss: 0.0232 92/500 [====>.........................] - ETA: 2:18 - loss: 0.3684 - regression_loss: 0.3454 - classification_loss: 0.0231 93/500 [====>.........................] - ETA: 2:18 - loss: 0.3667 - regression_loss: 0.3437 - classification_loss: 0.0230 94/500 [====>.........................] - ETA: 2:18 - loss: 0.3662 - regression_loss: 0.3432 - classification_loss: 0.0231 95/500 [====>.........................] - ETA: 2:17 - loss: 0.3644 - regression_loss: 0.3414 - classification_loss: 0.0230 96/500 [====>.........................] - ETA: 2:17 - loss: 0.3629 - regression_loss: 0.3400 - classification_loss: 0.0229 97/500 [====>.........................] - ETA: 2:17 - loss: 0.3618 - regression_loss: 0.3388 - classification_loss: 0.0229 98/500 [====>.........................] - ETA: 2:16 - loss: 0.3591 - regression_loss: 0.3363 - classification_loss: 0.0228 99/500 [====>.........................] - ETA: 2:16 - loss: 0.3591 - regression_loss: 0.3361 - classification_loss: 0.0231 100/500 [=====>........................] 
- ETA: 2:16 - loss: 0.3598 - regression_loss: 0.3367 - classification_loss: 0.0232 101/500 [=====>........................] - ETA: 2:15 - loss: 0.3587 - regression_loss: 0.3355 - classification_loss: 0.0231 102/500 [=====>........................] - ETA: 2:15 - loss: 0.3587 - regression_loss: 0.3356 - classification_loss: 0.0231 103/500 [=====>........................] - ETA: 2:15 - loss: 0.3620 - regression_loss: 0.3389 - classification_loss: 0.0231 104/500 [=====>........................] - ETA: 2:14 - loss: 0.3623 - regression_loss: 0.3390 - classification_loss: 0.0232 105/500 [=====>........................] - ETA: 2:14 - loss: 0.3610 - regression_loss: 0.3379 - classification_loss: 0.0232 106/500 [=====>........................] - ETA: 2:14 - loss: 0.3609 - regression_loss: 0.3377 - classification_loss: 0.0231 107/500 [=====>........................] - ETA: 2:13 - loss: 0.3602 - regression_loss: 0.3371 - classification_loss: 0.0231 108/500 [=====>........................] - ETA: 2:13 - loss: 0.3586 - regression_loss: 0.3357 - classification_loss: 0.0229 109/500 [=====>........................] - ETA: 2:13 - loss: 0.3592 - regression_loss: 0.3361 - classification_loss: 0.0231 110/500 [=====>........................] - ETA: 2:12 - loss: 0.3610 - regression_loss: 0.3375 - classification_loss: 0.0235 111/500 [=====>........................] - ETA: 2:12 - loss: 0.3595 - regression_loss: 0.3360 - classification_loss: 0.0235 112/500 [=====>........................] - ETA: 2:12 - loss: 0.3586 - regression_loss: 0.3353 - classification_loss: 0.0233 113/500 [=====>........................] - ETA: 2:11 - loss: 0.3586 - regression_loss: 0.3352 - classification_loss: 0.0234 114/500 [=====>........................] - ETA: 2:11 - loss: 0.3573 - regression_loss: 0.3340 - classification_loss: 0.0232 115/500 [=====>........................] - ETA: 2:11 - loss: 0.3566 - regression_loss: 0.3334 - classification_loss: 0.0232 116/500 [=====>........................] 
- ETA: 2:10 - loss: 0.3567 - regression_loss: 0.3335 - classification_loss: 0.0232 117/500 [======>.......................] - ETA: 2:10 - loss: 0.3565 - regression_loss: 0.3332 - classification_loss: 0.0233 118/500 [======>.......................] - ETA: 2:10 - loss: 0.3556 - regression_loss: 0.3325 - classification_loss: 0.0232 119/500 [======>.......................] - ETA: 2:09 - loss: 0.3547 - regression_loss: 0.3316 - classification_loss: 0.0231 120/500 [======>.......................] - ETA: 2:09 - loss: 0.3550 - regression_loss: 0.3318 - classification_loss: 0.0232 121/500 [======>.......................] - ETA: 2:09 - loss: 0.3542 - regression_loss: 0.3312 - classification_loss: 0.0231 122/500 [======>.......................] - ETA: 2:08 - loss: 0.3538 - regression_loss: 0.3308 - classification_loss: 0.0230 123/500 [======>.......................] - ETA: 2:08 - loss: 0.3539 - regression_loss: 0.3308 - classification_loss: 0.0230 124/500 [======>.......................] - ETA: 2:08 - loss: 0.3533 - regression_loss: 0.3303 - classification_loss: 0.0230 125/500 [======>.......................] - ETA: 2:07 - loss: 0.3550 - regression_loss: 0.3320 - classification_loss: 0.0229 126/500 [======>.......................] - ETA: 2:07 - loss: 0.3588 - regression_loss: 0.3355 - classification_loss: 0.0233 127/500 [======>.......................] - ETA: 2:07 - loss: 0.3585 - regression_loss: 0.3353 - classification_loss: 0.0232 128/500 [======>.......................] - ETA: 2:06 - loss: 0.3590 - regression_loss: 0.3358 - classification_loss: 0.0232 129/500 [======>.......................] - ETA: 2:06 - loss: 0.3582 - regression_loss: 0.3350 - classification_loss: 0.0231 130/500 [======>.......................] - ETA: 2:06 - loss: 0.3583 - regression_loss: 0.3351 - classification_loss: 0.0232 131/500 [======>.......................] - ETA: 2:05 - loss: 0.3576 - regression_loss: 0.3345 - classification_loss: 0.0231 132/500 [======>.......................] 
- ETA: 2:05 - loss: 0.3563 - regression_loss: 0.3334 - classification_loss: 0.0230 133/500 [======>.......................] - ETA: 2:05 - loss: 0.3564 - regression_loss: 0.3335 - classification_loss: 0.0229 134/500 [=======>......................] - ETA: 2:04 - loss: 0.3552 - regression_loss: 0.3324 - classification_loss: 0.0228 135/500 [=======>......................] - ETA: 2:04 - loss: 0.3577 - regression_loss: 0.3349 - classification_loss: 0.0228 136/500 [=======>......................] - ETA: 2:04 - loss: 0.3569 - regression_loss: 0.3341 - classification_loss: 0.0228 137/500 [=======>......................] - ETA: 2:03 - loss: 0.3567 - regression_loss: 0.3339 - classification_loss: 0.0228 138/500 [=======>......................] - ETA: 2:03 - loss: 0.3566 - regression_loss: 0.3338 - classification_loss: 0.0227 139/500 [=======>......................] - ETA: 2:03 - loss: 0.3566 - regression_loss: 0.3339 - classification_loss: 0.0227 140/500 [=======>......................] - ETA: 2:02 - loss: 0.3564 - regression_loss: 0.3337 - classification_loss: 0.0227 141/500 [=======>......................] - ETA: 2:02 - loss: 0.3548 - regression_loss: 0.3323 - classification_loss: 0.0225 142/500 [=======>......................] - ETA: 2:02 - loss: 0.3540 - regression_loss: 0.3316 - classification_loss: 0.0225 143/500 [=======>......................] - ETA: 2:01 - loss: 0.3535 - regression_loss: 0.3312 - classification_loss: 0.0224 144/500 [=======>......................] - ETA: 2:01 - loss: 0.3532 - regression_loss: 0.3309 - classification_loss: 0.0223 145/500 [=======>......................] - ETA: 2:01 - loss: 0.3534 - regression_loss: 0.3311 - classification_loss: 0.0223 146/500 [=======>......................] - ETA: 2:00 - loss: 0.3523 - regression_loss: 0.3301 - classification_loss: 0.0223 147/500 [=======>......................] - ETA: 2:00 - loss: 0.3515 - regression_loss: 0.3294 - classification_loss: 0.0222 148/500 [=======>......................] 
- ETA: 2:00 - loss: 0.3513 - regression_loss: 0.3291 - classification_loss: 0.0221 149/500 [=======>......................] - ETA: 1:59 - loss: 0.3512 - regression_loss: 0.3290 - classification_loss: 0.0222 150/500 [========>.....................] - ETA: 1:59 - loss: 0.3506 - regression_loss: 0.3284 - classification_loss: 0.0222 151/500 [========>.....................] - ETA: 1:59 - loss: 0.3506 - regression_loss: 0.3285 - classification_loss: 0.0221 152/500 [========>.....................] - ETA: 1:58 - loss: 0.3497 - regression_loss: 0.3277 - classification_loss: 0.0220 153/500 [========>.....................] - ETA: 1:58 - loss: 0.3499 - regression_loss: 0.3277 - classification_loss: 0.0221 154/500 [========>.....................] - ETA: 1:58 - loss: 0.3506 - regression_loss: 0.3281 - classification_loss: 0.0225 155/500 [========>.....................] - ETA: 1:57 - loss: 0.3499 - regression_loss: 0.3275 - classification_loss: 0.0224 156/500 [========>.....................] - ETA: 1:57 - loss: 0.3496 - regression_loss: 0.3273 - classification_loss: 0.0223 157/500 [========>.....................] - ETA: 1:57 - loss: 0.3502 - regression_loss: 0.3278 - classification_loss: 0.0223 158/500 [========>.....................] - ETA: 1:56 - loss: 0.3503 - regression_loss: 0.3280 - classification_loss: 0.0223 159/500 [========>.....................] - ETA: 1:56 - loss: 0.3506 - regression_loss: 0.3283 - classification_loss: 0.0223 160/500 [========>.....................] - ETA: 1:56 - loss: 0.3494 - regression_loss: 0.3272 - classification_loss: 0.0222 161/500 [========>.....................] - ETA: 1:55 - loss: 0.3507 - regression_loss: 0.3284 - classification_loss: 0.0223 162/500 [========>.....................] - ETA: 1:55 - loss: 0.3508 - regression_loss: 0.3286 - classification_loss: 0.0222 163/500 [========>.....................] - ETA: 1:55 - loss: 0.3501 - regression_loss: 0.3280 - classification_loss: 0.0221 164/500 [========>.....................] 
- ETA: 1:54 - loss: 0.3497 - regression_loss: 0.3277 - classification_loss: 0.0220 165/500 [========>.....................] - ETA: 1:54 - loss: 0.3501 - regression_loss: 0.3279 - classification_loss: 0.0222 166/500 [========>.....................] - ETA: 1:54 - loss: 0.3498 - regression_loss: 0.3277 - classification_loss: 0.0221 167/500 [=========>....................] - ETA: 1:53 - loss: 0.3491 - regression_loss: 0.3271 - classification_loss: 0.0221 168/500 [=========>....................] - ETA: 1:53 - loss: 0.3482 - regression_loss: 0.3263 - classification_loss: 0.0220 169/500 [=========>....................] - ETA: 1:53 - loss: 0.3482 - regression_loss: 0.3263 - classification_loss: 0.0219 170/500 [=========>....................] - ETA: 1:52 - loss: 0.3503 - regression_loss: 0.3283 - classification_loss: 0.0220 171/500 [=========>....................] - ETA: 1:52 - loss: 0.3503 - regression_loss: 0.3283 - classification_loss: 0.0220 172/500 [=========>....................] - ETA: 1:52 - loss: 0.3516 - regression_loss: 0.3295 - classification_loss: 0.0222 173/500 [=========>....................] - ETA: 1:51 - loss: 0.3507 - regression_loss: 0.3285 - classification_loss: 0.0221 174/500 [=========>....................] - ETA: 1:51 - loss: 0.3506 - regression_loss: 0.3285 - classification_loss: 0.0221 175/500 [=========>....................] - ETA: 1:51 - loss: 0.3513 - regression_loss: 0.3290 - classification_loss: 0.0222 176/500 [=========>....................] - ETA: 1:50 - loss: 0.3516 - regression_loss: 0.3293 - classification_loss: 0.0223 177/500 [=========>....................] - ETA: 1:50 - loss: 0.3503 - regression_loss: 0.3281 - classification_loss: 0.0222 178/500 [=========>....................] - ETA: 1:50 - loss: 0.3506 - regression_loss: 0.3284 - classification_loss: 0.0222 179/500 [=========>....................] - ETA: 1:49 - loss: 0.3507 - regression_loss: 0.3285 - classification_loss: 0.0222 180/500 [=========>....................] 
- ETA: 1:49 - loss: 0.3504 - regression_loss: 0.3282 - classification_loss: 0.0222 181/500 [=========>....................] - ETA: 1:49 - loss: 0.3506 - regression_loss: 0.3284 - classification_loss: 0.0221 182/500 [=========>....................] - ETA: 1:48 - loss: 0.3504 - regression_loss: 0.3283 - classification_loss: 0.0221 183/500 [=========>....................] - ETA: 1:48 - loss: 0.3501 - regression_loss: 0.3282 - classification_loss: 0.0220 184/500 [==========>...................] - ETA: 1:48 - loss: 0.3491 - regression_loss: 0.3272 - classification_loss: 0.0220 185/500 [==========>...................] - ETA: 1:47 - loss: 0.3524 - regression_loss: 0.3301 - classification_loss: 0.0223 186/500 [==========>...................] - ETA: 1:47 - loss: 0.3526 - regression_loss: 0.3302 - classification_loss: 0.0224 187/500 [==========>...................] - ETA: 1:47 - loss: 0.3545 - regression_loss: 0.3321 - classification_loss: 0.0224 188/500 [==========>...................] - ETA: 1:46 - loss: 0.3552 - regression_loss: 0.3328 - classification_loss: 0.0224 189/500 [==========>...................] - ETA: 1:46 - loss: 0.3541 - regression_loss: 0.3318 - classification_loss: 0.0223 190/500 [==========>...................] - ETA: 1:46 - loss: 0.3532 - regression_loss: 0.3310 - classification_loss: 0.0222 191/500 [==========>...................] - ETA: 1:45 - loss: 0.3530 - regression_loss: 0.3308 - classification_loss: 0.0222 192/500 [==========>...................] - ETA: 1:45 - loss: 0.3526 - regression_loss: 0.3304 - classification_loss: 0.0222 193/500 [==========>...................] - ETA: 1:44 - loss: 0.3532 - regression_loss: 0.3309 - classification_loss: 0.0223 194/500 [==========>...................] - ETA: 1:44 - loss: 0.3539 - regression_loss: 0.3314 - classification_loss: 0.0225 195/500 [==========>...................] - ETA: 1:44 - loss: 0.3537 - regression_loss: 0.3313 - classification_loss: 0.0225 196/500 [==========>...................] 
- ETA: 1:43 - loss: 0.3534 - regression_loss: 0.3310 - classification_loss: 0.0224 197/500 [==========>...................] - ETA: 1:43 - loss: 0.3530 - regression_loss: 0.3306 - classification_loss: 0.0224 198/500 [==========>...................] - ETA: 1:43 - loss: 0.3529 - regression_loss: 0.3304 - classification_loss: 0.0225 199/500 [==========>...................] - ETA: 1:42 - loss: 0.3528 - regression_loss: 0.3303 - classification_loss: 0.0225 200/500 [===========>..................] - ETA: 1:42 - loss: 0.3523 - regression_loss: 0.3298 - classification_loss: 0.0224 201/500 [===========>..................] - ETA: 1:42 - loss: 0.3521 - regression_loss: 0.3297 - classification_loss: 0.0224 202/500 [===========>..................] - ETA: 1:41 - loss: 0.3515 - regression_loss: 0.3291 - classification_loss: 0.0224 203/500 [===========>..................] - ETA: 1:41 - loss: 0.3519 - regression_loss: 0.3295 - classification_loss: 0.0224 204/500 [===========>..................] - ETA: 1:41 - loss: 0.3515 - regression_loss: 0.3291 - classification_loss: 0.0224 205/500 [===========>..................] - ETA: 1:40 - loss: 0.3515 - regression_loss: 0.3291 - classification_loss: 0.0224 206/500 [===========>..................] - ETA: 1:40 - loss: 0.3516 - regression_loss: 0.3292 - classification_loss: 0.0224 207/500 [===========>..................] - ETA: 1:40 - loss: 0.3521 - regression_loss: 0.3297 - classification_loss: 0.0225 208/500 [===========>..................] - ETA: 1:39 - loss: 0.3525 - regression_loss: 0.3300 - classification_loss: 0.0224 209/500 [===========>..................] - ETA: 1:39 - loss: 0.3586 - regression_loss: 0.3360 - classification_loss: 0.0226 210/500 [===========>..................] - ETA: 1:39 - loss: 0.3578 - regression_loss: 0.3352 - classification_loss: 0.0225 211/500 [===========>..................] - ETA: 1:38 - loss: 0.3589 - regression_loss: 0.3364 - classification_loss: 0.0225 212/500 [===========>..................] 
[Epoch 33 per-step progress (steps 213–499) elided; loss fluctuated between ~0.347 and ~0.365]
500/500 [==============================] - 171s 342ms/step - loss: 0.3482 - regression_loss: 0.3267 - classification_loss: 0.0215
1172 instances of class plum with average precision: 0.7458
mAP: 0.7458
Epoch 00033: saving model to ./training/snapshots/resnet101_pascal_33.h5
Epoch 34/150
[Epoch 34 per-step progress (steps 1–13) elided]
- ETA: 2:48 - loss: 0.2974 - regression_loss: 0.2783 - classification_loss: 0.0192 15/500 [..............................] - ETA: 2:48 - loss: 0.2927 - regression_loss: 0.2734 - classification_loss: 0.0193 16/500 [..............................] - ETA: 2:47 - loss: 0.2893 - regression_loss: 0.2704 - classification_loss: 0.0189 17/500 [>.............................] - ETA: 2:47 - loss: 0.2851 - regression_loss: 0.2667 - classification_loss: 0.0185 18/500 [>.............................] - ETA: 2:46 - loss: 0.2832 - regression_loss: 0.2650 - classification_loss: 0.0182 19/500 [>.............................] - ETA: 2:46 - loss: 0.2883 - regression_loss: 0.2689 - classification_loss: 0.0195 20/500 [>.............................] - ETA: 2:46 - loss: 0.2964 - regression_loss: 0.2753 - classification_loss: 0.0210 21/500 [>.............................] - ETA: 2:45 - loss: 0.2898 - regression_loss: 0.2696 - classification_loss: 0.0202 22/500 [>.............................] - ETA: 2:45 - loss: 0.2928 - regression_loss: 0.2727 - classification_loss: 0.0200 23/500 [>.............................] - ETA: 2:44 - loss: 0.2941 - regression_loss: 0.2741 - classification_loss: 0.0200 24/500 [>.............................] - ETA: 2:44 - loss: 0.2907 - regression_loss: 0.2713 - classification_loss: 0.0195 25/500 [>.............................] - ETA: 2:43 - loss: 0.2949 - regression_loss: 0.2759 - classification_loss: 0.0190 26/500 [>.............................] - ETA: 2:43 - loss: 0.2935 - regression_loss: 0.2747 - classification_loss: 0.0189 27/500 [>.............................] - ETA: 2:42 - loss: 0.2948 - regression_loss: 0.2757 - classification_loss: 0.0191 28/500 [>.............................] - ETA: 2:42 - loss: 0.2946 - regression_loss: 0.2753 - classification_loss: 0.0193 29/500 [>.............................] - ETA: 2:41 - loss: 0.2950 - regression_loss: 0.2761 - classification_loss: 0.0189 30/500 [>.............................] 
- ETA: 2:41 - loss: 0.2963 - regression_loss: 0.2774 - classification_loss: 0.0189 31/500 [>.............................] - ETA: 2:41 - loss: 0.3012 - regression_loss: 0.2812 - classification_loss: 0.0200 32/500 [>.............................] - ETA: 2:40 - loss: 0.2989 - regression_loss: 0.2793 - classification_loss: 0.0196 33/500 [>.............................] - ETA: 2:40 - loss: 0.3096 - regression_loss: 0.2890 - classification_loss: 0.0206 34/500 [=>............................] - ETA: 2:39 - loss: 0.3138 - regression_loss: 0.2928 - classification_loss: 0.0210 35/500 [=>............................] - ETA: 2:39 - loss: 0.3117 - regression_loss: 0.2910 - classification_loss: 0.0207 36/500 [=>............................] - ETA: 2:39 - loss: 0.3080 - regression_loss: 0.2875 - classification_loss: 0.0205 37/500 [=>............................] - ETA: 2:38 - loss: 0.3113 - regression_loss: 0.2907 - classification_loss: 0.0206 38/500 [=>............................] - ETA: 2:38 - loss: 0.3143 - regression_loss: 0.2936 - classification_loss: 0.0207 39/500 [=>............................] - ETA: 2:37 - loss: 0.3157 - regression_loss: 0.2952 - classification_loss: 0.0205 40/500 [=>............................] - ETA: 2:37 - loss: 0.3098 - regression_loss: 0.2897 - classification_loss: 0.0201 41/500 [=>............................] - ETA: 2:37 - loss: 0.3095 - regression_loss: 0.2894 - classification_loss: 0.0201 42/500 [=>............................] - ETA: 2:37 - loss: 0.3167 - regression_loss: 0.2966 - classification_loss: 0.0201 43/500 [=>............................] - ETA: 2:36 - loss: 0.3164 - regression_loss: 0.2965 - classification_loss: 0.0198 44/500 [=>............................] - ETA: 2:36 - loss: 0.3130 - regression_loss: 0.2935 - classification_loss: 0.0195 45/500 [=>............................] - ETA: 2:35 - loss: 0.3096 - regression_loss: 0.2904 - classification_loss: 0.0192 46/500 [=>............................] 
- ETA: 2:35 - loss: 0.3067 - regression_loss: 0.2878 - classification_loss: 0.0189 47/500 [=>............................] - ETA: 2:34 - loss: 0.3043 - regression_loss: 0.2856 - classification_loss: 0.0187 48/500 [=>............................] - ETA: 2:34 - loss: 0.3085 - regression_loss: 0.2896 - classification_loss: 0.0190 49/500 [=>............................] - ETA: 2:34 - loss: 0.3118 - regression_loss: 0.2926 - classification_loss: 0.0192 50/500 [==>...........................] - ETA: 2:33 - loss: 0.3084 - regression_loss: 0.2895 - classification_loss: 0.0189 51/500 [==>...........................] - ETA: 2:33 - loss: 0.3061 - regression_loss: 0.2875 - classification_loss: 0.0186 52/500 [==>...........................] - ETA: 2:32 - loss: 0.3068 - regression_loss: 0.2881 - classification_loss: 0.0187 53/500 [==>...........................] - ETA: 2:32 - loss: 0.3093 - regression_loss: 0.2905 - classification_loss: 0.0188 54/500 [==>...........................] - ETA: 2:32 - loss: 0.3113 - regression_loss: 0.2921 - classification_loss: 0.0192 55/500 [==>...........................] - ETA: 2:31 - loss: 0.3078 - regression_loss: 0.2889 - classification_loss: 0.0189 56/500 [==>...........................] - ETA: 2:31 - loss: 0.3082 - regression_loss: 0.2892 - classification_loss: 0.0189 57/500 [==>...........................] - ETA: 2:31 - loss: 0.3066 - regression_loss: 0.2876 - classification_loss: 0.0190 58/500 [==>...........................] - ETA: 2:30 - loss: 0.3076 - regression_loss: 0.2881 - classification_loss: 0.0195 59/500 [==>...........................] - ETA: 2:30 - loss: 0.3074 - regression_loss: 0.2882 - classification_loss: 0.0193 60/500 [==>...........................] - ETA: 2:30 - loss: 0.3054 - regression_loss: 0.2864 - classification_loss: 0.0191 61/500 [==>...........................] - ETA: 2:29 - loss: 0.3053 - regression_loss: 0.2862 - classification_loss: 0.0191 62/500 [==>...........................] 
- ETA: 2:29 - loss: 0.3074 - regression_loss: 0.2878 - classification_loss: 0.0196 63/500 [==>...........................] - ETA: 2:29 - loss: 0.3088 - regression_loss: 0.2890 - classification_loss: 0.0197 64/500 [==>...........................] - ETA: 2:28 - loss: 0.3055 - regression_loss: 0.2860 - classification_loss: 0.0195 65/500 [==>...........................] - ETA: 2:28 - loss: 0.3065 - regression_loss: 0.2869 - classification_loss: 0.0195 66/500 [==>...........................] - ETA: 2:28 - loss: 0.3049 - regression_loss: 0.2855 - classification_loss: 0.0194 67/500 [===>..........................] - ETA: 2:27 - loss: 0.3021 - regression_loss: 0.2829 - classification_loss: 0.0192 68/500 [===>..........................] - ETA: 2:27 - loss: 0.3001 - regression_loss: 0.2809 - classification_loss: 0.0192 69/500 [===>..........................] - ETA: 2:27 - loss: 0.3094 - regression_loss: 0.2897 - classification_loss: 0.0197 70/500 [===>..........................] - ETA: 2:26 - loss: 0.3131 - regression_loss: 0.2931 - classification_loss: 0.0200 71/500 [===>..........................] - ETA: 2:26 - loss: 0.3171 - regression_loss: 0.2970 - classification_loss: 0.0201 72/500 [===>..........................] - ETA: 2:26 - loss: 0.3177 - regression_loss: 0.2978 - classification_loss: 0.0199 73/500 [===>..........................] - ETA: 2:26 - loss: 0.3182 - regression_loss: 0.2984 - classification_loss: 0.0198 74/500 [===>..........................] - ETA: 2:25 - loss: 0.3200 - regression_loss: 0.3001 - classification_loss: 0.0200 75/500 [===>..........................] - ETA: 2:25 - loss: 0.3186 - regression_loss: 0.2988 - classification_loss: 0.0198 76/500 [===>..........................] - ETA: 2:24 - loss: 0.3219 - regression_loss: 0.3019 - classification_loss: 0.0200 77/500 [===>..........................] - ETA: 2:24 - loss: 0.3246 - regression_loss: 0.3043 - classification_loss: 0.0203 78/500 [===>..........................] 
- ETA: 2:24 - loss: 0.3256 - regression_loss: 0.3053 - classification_loss: 0.0203 79/500 [===>..........................] - ETA: 2:23 - loss: 0.3250 - regression_loss: 0.3046 - classification_loss: 0.0204 80/500 [===>..........................] - ETA: 2:23 - loss: 0.3256 - regression_loss: 0.3052 - classification_loss: 0.0204 81/500 [===>..........................] - ETA: 2:23 - loss: 0.3255 - regression_loss: 0.3050 - classification_loss: 0.0204 82/500 [===>..........................] - ETA: 2:22 - loss: 0.3248 - regression_loss: 0.3044 - classification_loss: 0.0204 83/500 [===>..........................] - ETA: 2:22 - loss: 0.3248 - regression_loss: 0.3045 - classification_loss: 0.0204 84/500 [====>.........................] - ETA: 2:22 - loss: 0.3238 - regression_loss: 0.3034 - classification_loss: 0.0204 85/500 [====>.........................] - ETA: 2:21 - loss: 0.3233 - regression_loss: 0.3030 - classification_loss: 0.0203 86/500 [====>.........................] - ETA: 2:21 - loss: 0.3270 - regression_loss: 0.3066 - classification_loss: 0.0203 87/500 [====>.........................] - ETA: 2:21 - loss: 0.3280 - regression_loss: 0.3076 - classification_loss: 0.0204 88/500 [====>.........................] - ETA: 2:20 - loss: 0.3283 - regression_loss: 0.3079 - classification_loss: 0.0204 89/500 [====>.........................] - ETA: 2:20 - loss: 0.3278 - regression_loss: 0.3075 - classification_loss: 0.0202 90/500 [====>.........................] - ETA: 2:20 - loss: 0.3263 - regression_loss: 0.3063 - classification_loss: 0.0200 91/500 [====>.........................] - ETA: 2:19 - loss: 0.3275 - regression_loss: 0.3071 - classification_loss: 0.0204 92/500 [====>.........................] - ETA: 2:19 - loss: 0.3262 - regression_loss: 0.3059 - classification_loss: 0.0203 93/500 [====>.........................] - ETA: 2:19 - loss: 0.3250 - regression_loss: 0.3048 - classification_loss: 0.0201 94/500 [====>.........................] 
- ETA: 2:18 - loss: 0.3236 - regression_loss: 0.3036 - classification_loss: 0.0200 95/500 [====>.........................] - ETA: 2:18 - loss: 0.3237 - regression_loss: 0.3038 - classification_loss: 0.0200 96/500 [====>.........................] - ETA: 2:18 - loss: 0.3241 - regression_loss: 0.3042 - classification_loss: 0.0199 97/500 [====>.........................] - ETA: 2:17 - loss: 0.3244 - regression_loss: 0.3046 - classification_loss: 0.0198 98/500 [====>.........................] - ETA: 2:17 - loss: 0.3246 - regression_loss: 0.3048 - classification_loss: 0.0199 99/500 [====>.........................] - ETA: 2:17 - loss: 0.3260 - regression_loss: 0.3062 - classification_loss: 0.0199 100/500 [=====>........................] - ETA: 2:16 - loss: 0.3259 - regression_loss: 0.3061 - classification_loss: 0.0198 101/500 [=====>........................] - ETA: 2:16 - loss: 0.3265 - regression_loss: 0.3065 - classification_loss: 0.0199 102/500 [=====>........................] - ETA: 2:16 - loss: 0.3254 - regression_loss: 0.3056 - classification_loss: 0.0198 103/500 [=====>........................] - ETA: 2:15 - loss: 0.3268 - regression_loss: 0.3070 - classification_loss: 0.0198 104/500 [=====>........................] - ETA: 2:15 - loss: 0.3282 - regression_loss: 0.3084 - classification_loss: 0.0198 105/500 [=====>........................] - ETA: 2:15 - loss: 0.3286 - regression_loss: 0.3088 - classification_loss: 0.0199 106/500 [=====>........................] - ETA: 2:14 - loss: 0.3279 - regression_loss: 0.3080 - classification_loss: 0.0199 107/500 [=====>........................] - ETA: 2:14 - loss: 0.3293 - regression_loss: 0.3091 - classification_loss: 0.0201 108/500 [=====>........................] - ETA: 2:14 - loss: 0.3301 - regression_loss: 0.3099 - classification_loss: 0.0201 109/500 [=====>........................] - ETA: 2:13 - loss: 0.3314 - regression_loss: 0.3112 - classification_loss: 0.0202 110/500 [=====>........................] 
- ETA: 2:13 - loss: 0.3306 - regression_loss: 0.3103 - classification_loss: 0.0203 111/500 [=====>........................] - ETA: 2:13 - loss: 0.3292 - regression_loss: 0.3090 - classification_loss: 0.0201 112/500 [=====>........................] - ETA: 2:12 - loss: 0.3312 - regression_loss: 0.3107 - classification_loss: 0.0205 113/500 [=====>........................] - ETA: 2:12 - loss: 0.3310 - regression_loss: 0.3105 - classification_loss: 0.0205 114/500 [=====>........................] - ETA: 2:12 - loss: 0.3292 - regression_loss: 0.3088 - classification_loss: 0.0204 115/500 [=====>........................] - ETA: 2:11 - loss: 0.3295 - regression_loss: 0.3091 - classification_loss: 0.0204 116/500 [=====>........................] - ETA: 2:11 - loss: 0.3292 - regression_loss: 0.3090 - classification_loss: 0.0203 117/500 [======>.......................] - ETA: 2:11 - loss: 0.3273 - regression_loss: 0.3072 - classification_loss: 0.0201 118/500 [======>.......................] - ETA: 2:10 - loss: 0.3264 - regression_loss: 0.3064 - classification_loss: 0.0200 119/500 [======>.......................] - ETA: 2:10 - loss: 0.3264 - regression_loss: 0.3064 - classification_loss: 0.0200 120/500 [======>.......................] - ETA: 2:10 - loss: 0.3250 - regression_loss: 0.3051 - classification_loss: 0.0199 121/500 [======>.......................] - ETA: 2:09 - loss: 0.3239 - regression_loss: 0.3041 - classification_loss: 0.0198 122/500 [======>.......................] - ETA: 2:09 - loss: 0.3247 - regression_loss: 0.3050 - classification_loss: 0.0198 123/500 [======>.......................] - ETA: 2:09 - loss: 0.3257 - regression_loss: 0.3059 - classification_loss: 0.0198 124/500 [======>.......................] - ETA: 2:08 - loss: 0.3291 - regression_loss: 0.3090 - classification_loss: 0.0201 125/500 [======>.......................] - ETA: 2:08 - loss: 0.3297 - regression_loss: 0.3095 - classification_loss: 0.0202 126/500 [======>.......................] 
- ETA: 2:08 - loss: 0.3309 - regression_loss: 0.3108 - classification_loss: 0.0201 127/500 [======>.......................] - ETA: 2:07 - loss: 0.3312 - regression_loss: 0.3112 - classification_loss: 0.0200 128/500 [======>.......................] - ETA: 2:07 - loss: 0.3325 - regression_loss: 0.3125 - classification_loss: 0.0200 129/500 [======>.......................] - ETA: 2:07 - loss: 0.3318 - regression_loss: 0.3118 - classification_loss: 0.0200 130/500 [======>.......................] - ETA: 2:06 - loss: 0.3314 - regression_loss: 0.3115 - classification_loss: 0.0199 131/500 [======>.......................] - ETA: 2:06 - loss: 0.3309 - regression_loss: 0.3111 - classification_loss: 0.0198 132/500 [======>.......................] - ETA: 2:06 - loss: 0.3305 - regression_loss: 0.3107 - classification_loss: 0.0198 133/500 [======>.......................] - ETA: 2:05 - loss: 0.3314 - regression_loss: 0.3112 - classification_loss: 0.0202 134/500 [=======>......................] - ETA: 2:05 - loss: 0.3313 - regression_loss: 0.3112 - classification_loss: 0.0201 135/500 [=======>......................] - ETA: 2:05 - loss: 0.3313 - regression_loss: 0.3110 - classification_loss: 0.0203 136/500 [=======>......................] - ETA: 2:04 - loss: 0.3324 - regression_loss: 0.3122 - classification_loss: 0.0203 137/500 [=======>......................] - ETA: 2:04 - loss: 0.3322 - regression_loss: 0.3119 - classification_loss: 0.0203 138/500 [=======>......................] - ETA: 2:03 - loss: 0.3320 - regression_loss: 0.3116 - classification_loss: 0.0204 139/500 [=======>......................] - ETA: 2:03 - loss: 0.3332 - regression_loss: 0.3128 - classification_loss: 0.0204 140/500 [=======>......................] - ETA: 2:03 - loss: 0.3353 - regression_loss: 0.3149 - classification_loss: 0.0204 141/500 [=======>......................] - ETA: 2:02 - loss: 0.3368 - regression_loss: 0.3163 - classification_loss: 0.0205 142/500 [=======>......................] 
- ETA: 2:02 - loss: 0.3385 - regression_loss: 0.3178 - classification_loss: 0.0207 143/500 [=======>......................] - ETA: 2:02 - loss: 0.3373 - regression_loss: 0.3167 - classification_loss: 0.0205 144/500 [=======>......................] - ETA: 2:01 - loss: 0.3362 - regression_loss: 0.3157 - classification_loss: 0.0204 145/500 [=======>......................] - ETA: 2:01 - loss: 0.3393 - regression_loss: 0.3186 - classification_loss: 0.0207 146/500 [=======>......................] - ETA: 2:01 - loss: 0.3394 - regression_loss: 0.3187 - classification_loss: 0.0208 147/500 [=======>......................] - ETA: 2:00 - loss: 0.3384 - regression_loss: 0.3176 - classification_loss: 0.0209 148/500 [=======>......................] - ETA: 2:00 - loss: 0.3397 - regression_loss: 0.3188 - classification_loss: 0.0209 149/500 [=======>......................] - ETA: 2:00 - loss: 0.3432 - regression_loss: 0.3222 - classification_loss: 0.0210 150/500 [========>.....................] - ETA: 1:59 - loss: 0.3435 - regression_loss: 0.3224 - classification_loss: 0.0210 151/500 [========>.....................] - ETA: 1:59 - loss: 0.3446 - regression_loss: 0.3236 - classification_loss: 0.0211 152/500 [========>.....................] - ETA: 1:59 - loss: 0.3454 - regression_loss: 0.3245 - classification_loss: 0.0210 153/500 [========>.....................] - ETA: 1:59 - loss: 0.3453 - regression_loss: 0.3243 - classification_loss: 0.0209 154/500 [========>.....................] - ETA: 1:58 - loss: 0.3456 - regression_loss: 0.3246 - classification_loss: 0.0210 155/500 [========>.....................] - ETA: 1:58 - loss: 0.3468 - regression_loss: 0.3256 - classification_loss: 0.0212 156/500 [========>.....................] - ETA: 1:58 - loss: 0.3476 - regression_loss: 0.3262 - classification_loss: 0.0214 157/500 [========>.....................] - ETA: 1:57 - loss: 0.3466 - regression_loss: 0.3253 - classification_loss: 0.0213 158/500 [========>.....................] 
- ETA: 1:57 - loss: 0.3462 - regression_loss: 0.3249 - classification_loss: 0.0212 159/500 [========>.....................] - ETA: 1:56 - loss: 0.3470 - regression_loss: 0.3257 - classification_loss: 0.0213 160/500 [========>.....................] - ETA: 1:56 - loss: 0.3462 - regression_loss: 0.3249 - classification_loss: 0.0213 161/500 [========>.....................] - ETA: 1:56 - loss: 0.3453 - regression_loss: 0.3241 - classification_loss: 0.0212 162/500 [========>.....................] - ETA: 1:55 - loss: 0.3449 - regression_loss: 0.3238 - classification_loss: 0.0212 163/500 [========>.....................] - ETA: 1:55 - loss: 0.3457 - regression_loss: 0.3245 - classification_loss: 0.0212 164/500 [========>.....................] - ETA: 1:55 - loss: 0.3456 - regression_loss: 0.3245 - classification_loss: 0.0211 165/500 [========>.....................] - ETA: 1:54 - loss: 0.3460 - regression_loss: 0.3248 - classification_loss: 0.0212 166/500 [========>.....................] - ETA: 1:54 - loss: 0.3468 - regression_loss: 0.3256 - classification_loss: 0.0212 167/500 [=========>....................] - ETA: 1:54 - loss: 0.3468 - regression_loss: 0.3257 - classification_loss: 0.0212 168/500 [=========>....................] - ETA: 1:53 - loss: 0.3470 - regression_loss: 0.3259 - classification_loss: 0.0211 169/500 [=========>....................] - ETA: 1:53 - loss: 0.3470 - regression_loss: 0.3259 - classification_loss: 0.0211 170/500 [=========>....................] - ETA: 1:53 - loss: 0.3470 - regression_loss: 0.3258 - classification_loss: 0.0211 171/500 [=========>....................] - ETA: 1:52 - loss: 0.3470 - regression_loss: 0.3259 - classification_loss: 0.0211 172/500 [=========>....................] - ETA: 1:52 - loss: 0.3455 - regression_loss: 0.3245 - classification_loss: 0.0210 173/500 [=========>....................] - ETA: 1:51 - loss: 0.3462 - regression_loss: 0.3251 - classification_loss: 0.0210 174/500 [=========>....................] 
- ETA: 1:51 - loss: 0.3457 - regression_loss: 0.3248 - classification_loss: 0.0209 175/500 [=========>....................] - ETA: 1:51 - loss: 0.3466 - regression_loss: 0.3256 - classification_loss: 0.0211 176/500 [=========>....................] - ETA: 1:50 - loss: 0.3462 - regression_loss: 0.3252 - classification_loss: 0.0210 177/500 [=========>....................] - ETA: 1:50 - loss: 0.3465 - regression_loss: 0.3254 - classification_loss: 0.0211 178/500 [=========>....................] - ETA: 1:50 - loss: 0.3457 - regression_loss: 0.3247 - classification_loss: 0.0210 179/500 [=========>....................] - ETA: 1:49 - loss: 0.3454 - regression_loss: 0.3244 - classification_loss: 0.0210 180/500 [=========>....................] - ETA: 1:49 - loss: 0.3452 - regression_loss: 0.3243 - classification_loss: 0.0209 181/500 [=========>....................] - ETA: 1:49 - loss: 0.3461 - regression_loss: 0.3251 - classification_loss: 0.0210 182/500 [=========>....................] - ETA: 1:48 - loss: 0.3465 - regression_loss: 0.3255 - classification_loss: 0.0210 183/500 [=========>....................] - ETA: 1:48 - loss: 0.3454 - regression_loss: 0.3245 - classification_loss: 0.0209 184/500 [==========>...................] - ETA: 1:48 - loss: 0.3463 - regression_loss: 0.3254 - classification_loss: 0.0209 185/500 [==========>...................] - ETA: 1:47 - loss: 0.3469 - regression_loss: 0.3259 - classification_loss: 0.0210 186/500 [==========>...................] - ETA: 1:47 - loss: 0.3463 - regression_loss: 0.3253 - classification_loss: 0.0210 187/500 [==========>...................] - ETA: 1:47 - loss: 0.3455 - regression_loss: 0.3245 - classification_loss: 0.0209 188/500 [==========>...................] - ETA: 1:46 - loss: 0.3472 - regression_loss: 0.3262 - classification_loss: 0.0210 189/500 [==========>...................] - ETA: 1:46 - loss: 0.3461 - regression_loss: 0.3251 - classification_loss: 0.0209 190/500 [==========>...................] 
- ETA: 1:45 - loss: 0.3469 - regression_loss: 0.3258 - classification_loss: 0.0211 191/500 [==========>...................] - ETA: 1:45 - loss: 0.3466 - regression_loss: 0.3254 - classification_loss: 0.0211 192/500 [==========>...................] - ETA: 1:45 - loss: 0.3463 - regression_loss: 0.3252 - classification_loss: 0.0211 193/500 [==========>...................] - ETA: 1:45 - loss: 0.3457 - regression_loss: 0.3247 - classification_loss: 0.0210 194/500 [==========>...................] - ETA: 1:44 - loss: 0.3457 - regression_loss: 0.3247 - classification_loss: 0.0210 195/500 [==========>...................] - ETA: 1:44 - loss: 0.3449 - regression_loss: 0.3240 - classification_loss: 0.0209 196/500 [==========>...................] - ETA: 1:44 - loss: 0.3458 - regression_loss: 0.3248 - classification_loss: 0.0210 197/500 [==========>...................] - ETA: 1:43 - loss: 0.3462 - regression_loss: 0.3251 - classification_loss: 0.0211 198/500 [==========>...................] - ETA: 1:43 - loss: 0.3468 - regression_loss: 0.3257 - classification_loss: 0.0211 199/500 [==========>...................] - ETA: 1:43 - loss: 0.3461 - regression_loss: 0.3251 - classification_loss: 0.0210 200/500 [===========>..................] - ETA: 1:42 - loss: 0.3463 - regression_loss: 0.3253 - classification_loss: 0.0210 201/500 [===========>..................] - ETA: 1:42 - loss: 0.3466 - regression_loss: 0.3257 - classification_loss: 0.0210 202/500 [===========>..................] - ETA: 1:41 - loss: 0.3459 - regression_loss: 0.3250 - classification_loss: 0.0209 203/500 [===========>..................] - ETA: 1:41 - loss: 0.3457 - regression_loss: 0.3249 - classification_loss: 0.0209 204/500 [===========>..................] - ETA: 1:41 - loss: 0.3462 - regression_loss: 0.3254 - classification_loss: 0.0208 205/500 [===========>..................] - ETA: 1:40 - loss: 0.3458 - regression_loss: 0.3250 - classification_loss: 0.0208 206/500 [===========>..................] 
- ETA: 1:40 - loss: 0.3463 - regression_loss: 0.3254 - classification_loss: 0.0209 207/500 [===========>..................] - ETA: 1:40 - loss: 0.3454 - regression_loss: 0.3247 - classification_loss: 0.0208 208/500 [===========>..................] - ETA: 1:39 - loss: 0.3448 - regression_loss: 0.3240 - classification_loss: 0.0207 209/500 [===========>..................] - ETA: 1:39 - loss: 0.3447 - regression_loss: 0.3240 - classification_loss: 0.0207 210/500 [===========>..................] - ETA: 1:39 - loss: 0.3441 - regression_loss: 0.3235 - classification_loss: 0.0207 211/500 [===========>..................] - ETA: 1:38 - loss: 0.3445 - regression_loss: 0.3239 - classification_loss: 0.0206 212/500 [===========>..................] - ETA: 1:38 - loss: 0.3450 - regression_loss: 0.3242 - classification_loss: 0.0209 213/500 [===========>..................] - ETA: 1:38 - loss: 0.3444 - regression_loss: 0.3236 - classification_loss: 0.0208 214/500 [===========>..................] - ETA: 1:37 - loss: 0.3442 - regression_loss: 0.3234 - classification_loss: 0.0208 215/500 [===========>..................] - ETA: 1:37 - loss: 0.3434 - regression_loss: 0.3226 - classification_loss: 0.0208 216/500 [===========>..................] - ETA: 1:37 - loss: 0.3443 - regression_loss: 0.3234 - classification_loss: 0.0208 217/500 [============>.................] - ETA: 1:36 - loss: 0.3438 - regression_loss: 0.3230 - classification_loss: 0.0208 218/500 [============>.................] - ETA: 1:36 - loss: 0.3439 - regression_loss: 0.3231 - classification_loss: 0.0208 219/500 [============>.................] - ETA: 1:36 - loss: 0.3446 - regression_loss: 0.3238 - classification_loss: 0.0208 220/500 [============>.................] - ETA: 1:35 - loss: 0.3449 - regression_loss: 0.3240 - classification_loss: 0.0208 221/500 [============>.................] - ETA: 1:35 - loss: 0.3454 - regression_loss: 0.3245 - classification_loss: 0.0209 222/500 [============>.................] 
- ETA: 1:34 - loss: 0.3460 - regression_loss: 0.3250 - classification_loss: 0.0210 223/500 [============>.................] - ETA: 1:34 - loss: 0.3460 - regression_loss: 0.3250 - classification_loss: 0.0210 224/500 [============>.................] - ETA: 1:34 - loss: 0.3458 - regression_loss: 0.3248 - classification_loss: 0.0210 225/500 [============>.................] - ETA: 1:34 - loss: 0.3451 - regression_loss: 0.3241 - classification_loss: 0.0209 226/500 [============>.................] - ETA: 1:33 - loss: 0.3444 - regression_loss: 0.3236 - classification_loss: 0.0209 227/500 [============>.................] - ETA: 1:33 - loss: 0.3447 - regression_loss: 0.3239 - classification_loss: 0.0208 228/500 [============>.................] - ETA: 1:32 - loss: 0.3447 - regression_loss: 0.3238 - classification_loss: 0.0209 229/500 [============>.................] - ETA: 1:32 - loss: 0.3449 - regression_loss: 0.3240 - classification_loss: 0.0209 230/500 [============>.................] - ETA: 1:32 - loss: 0.3442 - regression_loss: 0.3234 - classification_loss: 0.0208 231/500 [============>.................] - ETA: 1:31 - loss: 0.3441 - regression_loss: 0.3234 - classification_loss: 0.0207 232/500 [============>.................] - ETA: 1:31 - loss: 0.3449 - regression_loss: 0.3239 - classification_loss: 0.0210 233/500 [============>.................] - ETA: 1:31 - loss: 0.3449 - regression_loss: 0.3239 - classification_loss: 0.0210 234/500 [=============>................] - ETA: 1:30 - loss: 0.3446 - regression_loss: 0.3236 - classification_loss: 0.0211 235/500 [=============>................] - ETA: 1:30 - loss: 0.3447 - regression_loss: 0.3236 - classification_loss: 0.0211 236/500 [=============>................] - ETA: 1:30 - loss: 0.3448 - regression_loss: 0.3237 - classification_loss: 0.0211 237/500 [=============>................] - ETA: 1:29 - loss: 0.3446 - regression_loss: 0.3236 - classification_loss: 0.0211 238/500 [=============>................] 
- ETA: 1:29 - loss: 0.3449 - regression_loss: 0.3238 - classification_loss: 0.0211 239/500 [=============>................] - ETA: 1:29 - loss: 0.3441 - regression_loss: 0.3231 - classification_loss: 0.0210 240/500 [=============>................] - ETA: 1:28 - loss: 0.3430 - regression_loss: 0.3221 - classification_loss: 0.0209 241/500 [=============>................] - ETA: 1:28 - loss: 0.3424 - regression_loss: 0.3216 - classification_loss: 0.0209 242/500 [=============>................] - ETA: 1:28 - loss: 0.3420 - regression_loss: 0.3212 - classification_loss: 0.0208 243/500 [=============>................] - ETA: 1:27 - loss: 0.3422 - regression_loss: 0.3213 - classification_loss: 0.0209 244/500 [=============>................] - ETA: 1:27 - loss: 0.3428 - regression_loss: 0.3219 - classification_loss: 0.0208 245/500 [=============>................] - ETA: 1:27 - loss: 0.3422 - regression_loss: 0.3214 - classification_loss: 0.0208 246/500 [=============>................] - ETA: 1:26 - loss: 0.3420 - regression_loss: 0.3211 - classification_loss: 0.0209 247/500 [=============>................] - ETA: 1:26 - loss: 0.3427 - regression_loss: 0.3219 - classification_loss: 0.0208 248/500 [=============>................] - ETA: 1:26 - loss: 0.3426 - regression_loss: 0.3218 - classification_loss: 0.0208 249/500 [=============>................] - ETA: 1:25 - loss: 0.3420 - regression_loss: 0.3212 - classification_loss: 0.0208 250/500 [==============>...............] - ETA: 1:25 - loss: 0.3419 - regression_loss: 0.3211 - classification_loss: 0.0208 251/500 [==============>...............] - ETA: 1:25 - loss: 0.3420 - regression_loss: 0.3212 - classification_loss: 0.0208 252/500 [==============>...............] - ETA: 1:24 - loss: 0.3414 - regression_loss: 0.3207 - classification_loss: 0.0207 253/500 [==============>...............] - ETA: 1:24 - loss: 0.3413 - regression_loss: 0.3206 - classification_loss: 0.0207 254/500 [==============>...............] 
[Per-step progress updates for epoch 34, steps 255-500, condensed; running loss moved from 0.3417 at step 255 to 0.3396 at step 499.]
500/500 [==============================] - 171s 343ms/step - loss: 0.3393 - regression_loss: 0.3193 - classification_loss: 0.0200
1172 instances of class plum with average precision: 0.7530
mAP: 0.7530
Epoch 00034: saving model to ./training/snapshots/resnet101_pascal_34.h5
Epoch 35/150
[Per-step progress updates for epoch 35, steps 1-89, condensed; running loss at step 88/500: loss: 0.3155 - regression_loss: 0.2953 - classification_loss: 0.0202.]
- ETA: 2:21 - loss: 0.3146 - regression_loss: 0.2946 - classification_loss: 0.0201 90/500 [====>.........................] - ETA: 2:21 - loss: 0.3138 - regression_loss: 0.2939 - classification_loss: 0.0199 91/500 [====>.........................] - ETA: 2:20 - loss: 0.3116 - regression_loss: 0.2919 - classification_loss: 0.0198 92/500 [====>.........................] - ETA: 2:20 - loss: 0.3115 - regression_loss: 0.2917 - classification_loss: 0.0197 93/500 [====>.........................] - ETA: 2:19 - loss: 0.3107 - regression_loss: 0.2910 - classification_loss: 0.0197 94/500 [====>.........................] - ETA: 2:19 - loss: 0.3082 - regression_loss: 0.2887 - classification_loss: 0.0196 95/500 [====>.........................] - ETA: 2:19 - loss: 0.3090 - regression_loss: 0.2893 - classification_loss: 0.0196 96/500 [====>.........................] - ETA: 2:18 - loss: 0.3105 - regression_loss: 0.2905 - classification_loss: 0.0201 97/500 [====>.........................] - ETA: 2:18 - loss: 0.3088 - regression_loss: 0.2889 - classification_loss: 0.0199 98/500 [====>.........................] - ETA: 2:18 - loss: 0.3079 - regression_loss: 0.2880 - classification_loss: 0.0199 99/500 [====>.........................] - ETA: 2:17 - loss: 0.3086 - regression_loss: 0.2887 - classification_loss: 0.0199 100/500 [=====>........................] - ETA: 2:17 - loss: 0.3079 - regression_loss: 0.2882 - classification_loss: 0.0197 101/500 [=====>........................] - ETA: 2:16 - loss: 0.3089 - regression_loss: 0.2892 - classification_loss: 0.0197 102/500 [=====>........................] - ETA: 2:16 - loss: 0.3109 - regression_loss: 0.2909 - classification_loss: 0.0199 103/500 [=====>........................] - ETA: 2:16 - loss: 0.3098 - regression_loss: 0.2900 - classification_loss: 0.0198 104/500 [=====>........................] - ETA: 2:16 - loss: 0.3081 - regression_loss: 0.2885 - classification_loss: 0.0196 105/500 [=====>........................] 
- ETA: 2:15 - loss: 0.3090 - regression_loss: 0.2895 - classification_loss: 0.0195 106/500 [=====>........................] - ETA: 2:15 - loss: 0.3131 - regression_loss: 0.2932 - classification_loss: 0.0199 107/500 [=====>........................] - ETA: 2:15 - loss: 0.3139 - regression_loss: 0.2941 - classification_loss: 0.0198 108/500 [=====>........................] - ETA: 2:14 - loss: 0.3128 - regression_loss: 0.2931 - classification_loss: 0.0197 109/500 [=====>........................] - ETA: 2:14 - loss: 0.3130 - regression_loss: 0.2933 - classification_loss: 0.0198 110/500 [=====>........................] - ETA: 2:14 - loss: 0.3144 - regression_loss: 0.2947 - classification_loss: 0.0197 111/500 [=====>........................] - ETA: 2:13 - loss: 0.3132 - regression_loss: 0.2936 - classification_loss: 0.0196 112/500 [=====>........................] - ETA: 2:13 - loss: 0.3139 - regression_loss: 0.2944 - classification_loss: 0.0195 113/500 [=====>........................] - ETA: 2:13 - loss: 0.3152 - regression_loss: 0.2955 - classification_loss: 0.0197 114/500 [=====>........................] - ETA: 2:12 - loss: 0.3169 - regression_loss: 0.2971 - classification_loss: 0.0198 115/500 [=====>........................] - ETA: 2:12 - loss: 0.3177 - regression_loss: 0.2980 - classification_loss: 0.0197 116/500 [=====>........................] - ETA: 2:12 - loss: 0.3178 - regression_loss: 0.2981 - classification_loss: 0.0197 117/500 [======>.......................] - ETA: 2:11 - loss: 0.3185 - regression_loss: 0.2988 - classification_loss: 0.0197 118/500 [======>.......................] - ETA: 2:11 - loss: 0.3196 - regression_loss: 0.2999 - classification_loss: 0.0197 119/500 [======>.......................] - ETA: 2:11 - loss: 0.3191 - regression_loss: 0.2995 - classification_loss: 0.0197 120/500 [======>.......................] - ETA: 2:10 - loss: 0.3194 - regression_loss: 0.2998 - classification_loss: 0.0196 121/500 [======>.......................] 
- ETA: 2:10 - loss: 0.3182 - regression_loss: 0.2987 - classification_loss: 0.0195 122/500 [======>.......................] - ETA: 2:09 - loss: 0.3195 - regression_loss: 0.3000 - classification_loss: 0.0195 123/500 [======>.......................] - ETA: 2:09 - loss: 0.3205 - regression_loss: 0.3009 - classification_loss: 0.0196 124/500 [======>.......................] - ETA: 2:09 - loss: 0.3210 - regression_loss: 0.3014 - classification_loss: 0.0196 125/500 [======>.......................] - ETA: 2:08 - loss: 0.3219 - regression_loss: 0.3023 - classification_loss: 0.0196 126/500 [======>.......................] - ETA: 2:08 - loss: 0.3206 - regression_loss: 0.3011 - classification_loss: 0.0195 127/500 [======>.......................] - ETA: 2:08 - loss: 0.3202 - regression_loss: 0.3007 - classification_loss: 0.0194 128/500 [======>.......................] - ETA: 2:07 - loss: 0.3197 - regression_loss: 0.3003 - classification_loss: 0.0194 129/500 [======>.......................] - ETA: 2:07 - loss: 0.3194 - regression_loss: 0.3000 - classification_loss: 0.0194 130/500 [======>.......................] - ETA: 2:07 - loss: 0.3208 - regression_loss: 0.3013 - classification_loss: 0.0195 131/500 [======>.......................] - ETA: 2:06 - loss: 0.3207 - regression_loss: 0.3013 - classification_loss: 0.0194 132/500 [======>.......................] - ETA: 2:06 - loss: 0.3208 - regression_loss: 0.3014 - classification_loss: 0.0194 133/500 [======>.......................] - ETA: 2:06 - loss: 0.3201 - regression_loss: 0.3008 - classification_loss: 0.0193 134/500 [=======>......................] - ETA: 2:05 - loss: 0.3212 - regression_loss: 0.3019 - classification_loss: 0.0193 135/500 [=======>......................] - ETA: 2:05 - loss: 0.3196 - regression_loss: 0.3004 - classification_loss: 0.0192 136/500 [=======>......................] - ETA: 2:05 - loss: 0.3188 - regression_loss: 0.2995 - classification_loss: 0.0192 137/500 [=======>......................] 
- ETA: 2:04 - loss: 0.3197 - regression_loss: 0.3004 - classification_loss: 0.0192 138/500 [=======>......................] - ETA: 2:04 - loss: 0.3194 - regression_loss: 0.3001 - classification_loss: 0.0192 139/500 [=======>......................] - ETA: 2:04 - loss: 0.3199 - regression_loss: 0.3006 - classification_loss: 0.0193 140/500 [=======>......................] - ETA: 2:03 - loss: 0.3200 - regression_loss: 0.3007 - classification_loss: 0.0194 141/500 [=======>......................] - ETA: 2:03 - loss: 0.3193 - regression_loss: 0.3000 - classification_loss: 0.0193 142/500 [=======>......................] - ETA: 2:03 - loss: 0.3195 - regression_loss: 0.3002 - classification_loss: 0.0193 143/500 [=======>......................] - ETA: 2:02 - loss: 0.3226 - regression_loss: 0.3033 - classification_loss: 0.0193 144/500 [=======>......................] - ETA: 2:02 - loss: 0.3223 - regression_loss: 0.3031 - classification_loss: 0.0192 145/500 [=======>......................] - ETA: 2:02 - loss: 0.3223 - regression_loss: 0.3031 - classification_loss: 0.0192 146/500 [=======>......................] - ETA: 2:01 - loss: 0.3219 - regression_loss: 0.3027 - classification_loss: 0.0192 147/500 [=======>......................] - ETA: 2:01 - loss: 0.3225 - regression_loss: 0.3034 - classification_loss: 0.0191 148/500 [=======>......................] - ETA: 2:01 - loss: 0.3227 - regression_loss: 0.3036 - classification_loss: 0.0191 149/500 [=======>......................] - ETA: 2:00 - loss: 0.3230 - regression_loss: 0.3039 - classification_loss: 0.0191 150/500 [========>.....................] - ETA: 2:00 - loss: 0.3225 - regression_loss: 0.3035 - classification_loss: 0.0190 151/500 [========>.....................] - ETA: 1:59 - loss: 0.3237 - regression_loss: 0.3046 - classification_loss: 0.0190 152/500 [========>.....................] - ETA: 1:59 - loss: 0.3240 - regression_loss: 0.3050 - classification_loss: 0.0190 153/500 [========>.....................] 
- ETA: 1:59 - loss: 0.3231 - regression_loss: 0.3042 - classification_loss: 0.0189 154/500 [========>.....................] - ETA: 1:58 - loss: 0.3225 - regression_loss: 0.3036 - classification_loss: 0.0189 155/500 [========>.....................] - ETA: 1:58 - loss: 0.3221 - regression_loss: 0.3033 - classification_loss: 0.0188 156/500 [========>.....................] - ETA: 1:58 - loss: 0.3217 - regression_loss: 0.3029 - classification_loss: 0.0188 157/500 [========>.....................] - ETA: 1:57 - loss: 0.3218 - regression_loss: 0.3029 - classification_loss: 0.0188 158/500 [========>.....................] - ETA: 1:57 - loss: 0.3218 - regression_loss: 0.3030 - classification_loss: 0.0189 159/500 [========>.....................] - ETA: 1:57 - loss: 0.3216 - regression_loss: 0.3027 - classification_loss: 0.0188 160/500 [========>.....................] - ETA: 1:56 - loss: 0.3217 - regression_loss: 0.3028 - classification_loss: 0.0188 161/500 [========>.....................] - ETA: 1:56 - loss: 0.3225 - regression_loss: 0.3036 - classification_loss: 0.0189 162/500 [========>.....................] - ETA: 1:56 - loss: 0.3232 - regression_loss: 0.3041 - classification_loss: 0.0191 163/500 [========>.....................] - ETA: 1:55 - loss: 0.3233 - regression_loss: 0.3042 - classification_loss: 0.0191 164/500 [========>.....................] - ETA: 1:55 - loss: 0.3225 - regression_loss: 0.3034 - classification_loss: 0.0190 165/500 [========>.....................] - ETA: 1:55 - loss: 0.3235 - regression_loss: 0.3044 - classification_loss: 0.0190 166/500 [========>.....................] - ETA: 1:54 - loss: 0.3241 - regression_loss: 0.3050 - classification_loss: 0.0191 167/500 [=========>....................] - ETA: 1:54 - loss: 0.3235 - regression_loss: 0.3045 - classification_loss: 0.0190 168/500 [=========>....................] - ETA: 1:54 - loss: 0.3236 - regression_loss: 0.3046 - classification_loss: 0.0190 169/500 [=========>....................] 
- ETA: 1:53 - loss: 0.3233 - regression_loss: 0.3043 - classification_loss: 0.0190 170/500 [=========>....................] - ETA: 1:53 - loss: 0.3250 - regression_loss: 0.3059 - classification_loss: 0.0191 171/500 [=========>....................] - ETA: 1:53 - loss: 0.3252 - regression_loss: 0.3061 - classification_loss: 0.0191 172/500 [=========>....................] - ETA: 1:52 - loss: 0.3247 - regression_loss: 0.3056 - classification_loss: 0.0191 173/500 [=========>....................] - ETA: 1:52 - loss: 0.3235 - regression_loss: 0.3045 - classification_loss: 0.0190 174/500 [=========>....................] - ETA: 1:52 - loss: 0.3226 - regression_loss: 0.3037 - classification_loss: 0.0189 175/500 [=========>....................] - ETA: 1:51 - loss: 0.3224 - regression_loss: 0.3035 - classification_loss: 0.0189 176/500 [=========>....................] - ETA: 1:51 - loss: 0.3219 - regression_loss: 0.3031 - classification_loss: 0.0188 177/500 [=========>....................] - ETA: 1:51 - loss: 0.3215 - regression_loss: 0.3028 - classification_loss: 0.0187 178/500 [=========>....................] - ETA: 1:50 - loss: 0.3206 - regression_loss: 0.3019 - classification_loss: 0.0187 179/500 [=========>....................] - ETA: 1:50 - loss: 0.3204 - regression_loss: 0.3017 - classification_loss: 0.0187 180/500 [=========>....................] - ETA: 1:50 - loss: 0.3201 - regression_loss: 0.3015 - classification_loss: 0.0186 181/500 [=========>....................] - ETA: 1:49 - loss: 0.3190 - regression_loss: 0.3004 - classification_loss: 0.0186 182/500 [=========>....................] - ETA: 1:49 - loss: 0.3192 - regression_loss: 0.3006 - classification_loss: 0.0186 183/500 [=========>....................] - ETA: 1:49 - loss: 0.3191 - regression_loss: 0.3005 - classification_loss: 0.0186 184/500 [==========>...................] - ETA: 1:48 - loss: 0.3191 - regression_loss: 0.3006 - classification_loss: 0.0185 185/500 [==========>...................] 
- ETA: 1:48 - loss: 0.3191 - regression_loss: 0.3005 - classification_loss: 0.0186 186/500 [==========>...................] - ETA: 1:48 - loss: 0.3187 - regression_loss: 0.3002 - classification_loss: 0.0185 187/500 [==========>...................] - ETA: 1:47 - loss: 0.3199 - regression_loss: 0.3012 - classification_loss: 0.0187 188/500 [==========>...................] - ETA: 1:47 - loss: 0.3191 - regression_loss: 0.3005 - classification_loss: 0.0186 189/500 [==========>...................] - ETA: 1:46 - loss: 0.3198 - regression_loss: 0.3011 - classification_loss: 0.0187 190/500 [==========>...................] - ETA: 1:46 - loss: 0.3199 - regression_loss: 0.3012 - classification_loss: 0.0187 191/500 [==========>...................] - ETA: 1:46 - loss: 0.3212 - regression_loss: 0.3024 - classification_loss: 0.0188 192/500 [==========>...................] - ETA: 1:45 - loss: 0.3216 - regression_loss: 0.3028 - classification_loss: 0.0188 193/500 [==========>...................] - ETA: 1:45 - loss: 0.3212 - regression_loss: 0.3025 - classification_loss: 0.0187 194/500 [==========>...................] - ETA: 1:45 - loss: 0.3213 - regression_loss: 0.3026 - classification_loss: 0.0187 195/500 [==========>...................] - ETA: 1:44 - loss: 0.3205 - regression_loss: 0.3018 - classification_loss: 0.0187 196/500 [==========>...................] - ETA: 1:44 - loss: 0.3198 - regression_loss: 0.3012 - classification_loss: 0.0186 197/500 [==========>...................] - ETA: 1:44 - loss: 0.3202 - regression_loss: 0.3015 - classification_loss: 0.0186 198/500 [==========>...................] - ETA: 1:43 - loss: 0.3189 - regression_loss: 0.3003 - classification_loss: 0.0186 199/500 [==========>...................] - ETA: 1:43 - loss: 0.3181 - regression_loss: 0.2995 - classification_loss: 0.0185 200/500 [===========>..................] - ETA: 1:43 - loss: 0.3174 - regression_loss: 0.2989 - classification_loss: 0.0185 201/500 [===========>..................] 
- ETA: 1:42 - loss: 0.3177 - regression_loss: 0.2991 - classification_loss: 0.0186 202/500 [===========>..................] - ETA: 1:42 - loss: 0.3186 - regression_loss: 0.3000 - classification_loss: 0.0186 203/500 [===========>..................] - ETA: 1:42 - loss: 0.3185 - regression_loss: 0.2999 - classification_loss: 0.0186 204/500 [===========>..................] - ETA: 1:41 - loss: 0.3192 - regression_loss: 0.3006 - classification_loss: 0.0186 205/500 [===========>..................] - ETA: 1:41 - loss: 0.3187 - regression_loss: 0.3001 - classification_loss: 0.0186 206/500 [===========>..................] - ETA: 1:41 - loss: 0.3200 - regression_loss: 0.3014 - classification_loss: 0.0187 207/500 [===========>..................] - ETA: 1:40 - loss: 0.3198 - regression_loss: 0.3012 - classification_loss: 0.0186 208/500 [===========>..................] - ETA: 1:40 - loss: 0.3202 - regression_loss: 0.3016 - classification_loss: 0.0186 209/500 [===========>..................] - ETA: 1:40 - loss: 0.3204 - regression_loss: 0.3018 - classification_loss: 0.0186 210/500 [===========>..................] - ETA: 1:39 - loss: 0.3201 - regression_loss: 0.3015 - classification_loss: 0.0186 211/500 [===========>..................] - ETA: 1:39 - loss: 0.3196 - regression_loss: 0.3011 - classification_loss: 0.0185 212/500 [===========>..................] - ETA: 1:39 - loss: 0.3187 - regression_loss: 0.3003 - classification_loss: 0.0185 213/500 [===========>..................] - ETA: 1:38 - loss: 0.3185 - regression_loss: 0.3001 - classification_loss: 0.0184 214/500 [===========>..................] - ETA: 1:38 - loss: 0.3178 - regression_loss: 0.2994 - classification_loss: 0.0184 215/500 [===========>..................] - ETA: 1:38 - loss: 0.3175 - regression_loss: 0.2991 - classification_loss: 0.0184 216/500 [===========>..................] - ETA: 1:37 - loss: 0.3198 - regression_loss: 0.3014 - classification_loss: 0.0184 217/500 [============>.................] 
- ETA: 1:37 - loss: 0.3254 - regression_loss: 0.3064 - classification_loss: 0.0190 218/500 [============>.................] - ETA: 1:36 - loss: 0.3281 - regression_loss: 0.3089 - classification_loss: 0.0192 219/500 [============>.................] - ETA: 1:36 - loss: 0.3288 - regression_loss: 0.3096 - classification_loss: 0.0192 220/500 [============>.................] - ETA: 1:36 - loss: 0.3288 - regression_loss: 0.3096 - classification_loss: 0.0193 221/500 [============>.................] - ETA: 1:35 - loss: 0.3281 - regression_loss: 0.3088 - classification_loss: 0.0192 222/500 [============>.................] - ETA: 1:35 - loss: 0.3280 - regression_loss: 0.3088 - classification_loss: 0.0192 223/500 [============>.................] - ETA: 1:35 - loss: 0.3289 - regression_loss: 0.3095 - classification_loss: 0.0193 224/500 [============>.................] - ETA: 1:34 - loss: 0.3296 - regression_loss: 0.3103 - classification_loss: 0.0193 225/500 [============>.................] - ETA: 1:34 - loss: 0.3302 - regression_loss: 0.3108 - classification_loss: 0.0194 226/500 [============>.................] - ETA: 1:34 - loss: 0.3311 - regression_loss: 0.3116 - classification_loss: 0.0195 227/500 [============>.................] - ETA: 1:33 - loss: 0.3307 - regression_loss: 0.3112 - classification_loss: 0.0195 228/500 [============>.................] - ETA: 1:33 - loss: 0.3310 - regression_loss: 0.3114 - classification_loss: 0.0196 229/500 [============>.................] - ETA: 1:33 - loss: 0.3312 - regression_loss: 0.3116 - classification_loss: 0.0195 230/500 [============>.................] - ETA: 1:32 - loss: 0.3310 - regression_loss: 0.3115 - classification_loss: 0.0195 231/500 [============>.................] - ETA: 1:32 - loss: 0.3311 - regression_loss: 0.3115 - classification_loss: 0.0196 232/500 [============>.................] - ETA: 1:32 - loss: 0.3306 - regression_loss: 0.3110 - classification_loss: 0.0195 233/500 [============>.................] 
- ETA: 1:31 - loss: 0.3303 - regression_loss: 0.3108 - classification_loss: 0.0195 234/500 [=============>................] - ETA: 1:31 - loss: 0.3296 - regression_loss: 0.3101 - classification_loss: 0.0194 235/500 [=============>................] - ETA: 1:31 - loss: 0.3285 - regression_loss: 0.3091 - classification_loss: 0.0194 236/500 [=============>................] - ETA: 1:30 - loss: 0.3281 - regression_loss: 0.3087 - classification_loss: 0.0194 237/500 [=============>................] - ETA: 1:30 - loss: 0.3278 - regression_loss: 0.3084 - classification_loss: 0.0194 238/500 [=============>................] - ETA: 1:30 - loss: 0.3267 - regression_loss: 0.3074 - classification_loss: 0.0193 239/500 [=============>................] - ETA: 1:29 - loss: 0.3260 - regression_loss: 0.3068 - classification_loss: 0.0193 240/500 [=============>................] - ETA: 1:29 - loss: 0.3286 - regression_loss: 0.3088 - classification_loss: 0.0197 241/500 [=============>................] - ETA: 1:29 - loss: 0.3281 - regression_loss: 0.3084 - classification_loss: 0.0197 242/500 [=============>................] - ETA: 1:28 - loss: 0.3286 - regression_loss: 0.3089 - classification_loss: 0.0197 243/500 [=============>................] - ETA: 1:28 - loss: 0.3285 - regression_loss: 0.3088 - classification_loss: 0.0197 244/500 [=============>................] - ETA: 1:27 - loss: 0.3286 - regression_loss: 0.3089 - classification_loss: 0.0197 245/500 [=============>................] - ETA: 1:27 - loss: 0.3284 - regression_loss: 0.3087 - classification_loss: 0.0197 246/500 [=============>................] - ETA: 1:27 - loss: 0.3278 - regression_loss: 0.3081 - classification_loss: 0.0196 247/500 [=============>................] - ETA: 1:26 - loss: 0.3281 - regression_loss: 0.3084 - classification_loss: 0.0196 248/500 [=============>................] - ETA: 1:26 - loss: 0.3290 - regression_loss: 0.3093 - classification_loss: 0.0196 249/500 [=============>................] 
- ETA: 1:26 - loss: 0.3284 - regression_loss: 0.3087 - classification_loss: 0.0196 250/500 [==============>...............] - ETA: 1:25 - loss: 0.3284 - regression_loss: 0.3088 - classification_loss: 0.0196 251/500 [==============>...............] - ETA: 1:25 - loss: 0.3284 - regression_loss: 0.3088 - classification_loss: 0.0196 252/500 [==============>...............] - ETA: 1:25 - loss: 0.3282 - regression_loss: 0.3087 - classification_loss: 0.0195 253/500 [==============>...............] - ETA: 1:24 - loss: 0.3277 - regression_loss: 0.3081 - classification_loss: 0.0195 254/500 [==============>...............] - ETA: 1:24 - loss: 0.3279 - regression_loss: 0.3083 - classification_loss: 0.0195 255/500 [==============>...............] - ETA: 1:24 - loss: 0.3283 - regression_loss: 0.3087 - classification_loss: 0.0196 256/500 [==============>...............] - ETA: 1:23 - loss: 0.3283 - regression_loss: 0.3087 - classification_loss: 0.0196 257/500 [==============>...............] - ETA: 1:23 - loss: 0.3280 - regression_loss: 0.3085 - classification_loss: 0.0195 258/500 [==============>...............] - ETA: 1:23 - loss: 0.3280 - regression_loss: 0.3085 - classification_loss: 0.0195 259/500 [==============>...............] - ETA: 1:22 - loss: 0.3274 - regression_loss: 0.3080 - classification_loss: 0.0194 260/500 [==============>...............] - ETA: 1:22 - loss: 0.3268 - regression_loss: 0.3074 - classification_loss: 0.0194 261/500 [==============>...............] - ETA: 1:21 - loss: 0.3270 - regression_loss: 0.3076 - classification_loss: 0.0194 262/500 [==============>...............] - ETA: 1:21 - loss: 0.3278 - regression_loss: 0.3084 - classification_loss: 0.0194 263/500 [==============>...............] - ETA: 1:21 - loss: 0.3283 - regression_loss: 0.3089 - classification_loss: 0.0195 264/500 [==============>...............] - ETA: 1:20 - loss: 0.3281 - regression_loss: 0.3087 - classification_loss: 0.0194 265/500 [==============>...............] 
- ETA: 1:20 - loss: 0.3283 - regression_loss: 0.3089 - classification_loss: 0.0195 266/500 [==============>...............] - ETA: 1:20 - loss: 0.3280 - regression_loss: 0.3086 - classification_loss: 0.0194 267/500 [===============>..............] - ETA: 1:19 - loss: 0.3277 - regression_loss: 0.3083 - classification_loss: 0.0194 268/500 [===============>..............] - ETA: 1:19 - loss: 0.3271 - regression_loss: 0.3078 - classification_loss: 0.0193 269/500 [===============>..............] - ETA: 1:19 - loss: 0.3269 - regression_loss: 0.3076 - classification_loss: 0.0193 270/500 [===============>..............] - ETA: 1:18 - loss: 0.3277 - regression_loss: 0.3082 - classification_loss: 0.0194 271/500 [===============>..............] - ETA: 1:18 - loss: 0.3281 - regression_loss: 0.3085 - classification_loss: 0.0195 272/500 [===============>..............] - ETA: 1:18 - loss: 0.3283 - regression_loss: 0.3087 - classification_loss: 0.0195 273/500 [===============>..............] - ETA: 1:17 - loss: 0.3281 - regression_loss: 0.3085 - classification_loss: 0.0195 274/500 [===============>..............] - ETA: 1:17 - loss: 0.3280 - regression_loss: 0.3085 - classification_loss: 0.0195 275/500 [===============>..............] - ETA: 1:17 - loss: 0.3286 - regression_loss: 0.3091 - classification_loss: 0.0195 276/500 [===============>..............] - ETA: 1:16 - loss: 0.3279 - regression_loss: 0.3084 - classification_loss: 0.0195 277/500 [===============>..............] - ETA: 1:16 - loss: 0.3280 - regression_loss: 0.3086 - classification_loss: 0.0195 278/500 [===============>..............] - ETA: 1:16 - loss: 0.3291 - regression_loss: 0.3096 - classification_loss: 0.0195 279/500 [===============>..............] - ETA: 1:15 - loss: 0.3295 - regression_loss: 0.3099 - classification_loss: 0.0197 280/500 [===============>..............] - ETA: 1:15 - loss: 0.3291 - regression_loss: 0.3095 - classification_loss: 0.0196 281/500 [===============>..............] 
- ETA: 1:15 - loss: 0.3291 - regression_loss: 0.3094 - classification_loss: 0.0196 282/500 [===============>..............] - ETA: 1:14 - loss: 0.3284 - regression_loss: 0.3088 - classification_loss: 0.0196 283/500 [===============>..............] - ETA: 1:14 - loss: 0.3279 - regression_loss: 0.3084 - classification_loss: 0.0195 284/500 [================>.............] - ETA: 1:14 - loss: 0.3279 - regression_loss: 0.3083 - classification_loss: 0.0195 285/500 [================>.............] - ETA: 1:13 - loss: 0.3279 - regression_loss: 0.3084 - classification_loss: 0.0195 286/500 [================>.............] - ETA: 1:13 - loss: 0.3275 - regression_loss: 0.3080 - classification_loss: 0.0195 287/500 [================>.............] - ETA: 1:13 - loss: 0.3277 - regression_loss: 0.3082 - classification_loss: 0.0195 288/500 [================>.............] - ETA: 1:12 - loss: 0.3281 - regression_loss: 0.3086 - classification_loss: 0.0195 289/500 [================>.............] - ETA: 1:12 - loss: 0.3278 - regression_loss: 0.3083 - classification_loss: 0.0195 290/500 [================>.............] - ETA: 1:12 - loss: 0.3284 - regression_loss: 0.3089 - classification_loss: 0.0195 291/500 [================>.............] - ETA: 1:11 - loss: 0.3300 - regression_loss: 0.3104 - classification_loss: 0.0195 292/500 [================>.............] - ETA: 1:11 - loss: 0.3302 - regression_loss: 0.3106 - classification_loss: 0.0196 293/500 [================>.............] - ETA: 1:11 - loss: 0.3294 - regression_loss: 0.3099 - classification_loss: 0.0195 294/500 [================>.............] - ETA: 1:10 - loss: 0.3297 - regression_loss: 0.3102 - classification_loss: 0.0195 295/500 [================>.............] - ETA: 1:10 - loss: 0.3300 - regression_loss: 0.3105 - classification_loss: 0.0195 296/500 [================>.............] - ETA: 1:10 - loss: 0.3302 - regression_loss: 0.3106 - classification_loss: 0.0196 297/500 [================>.............] 
[Epoch 35/150: per-batch progress-bar output for batches 298-500 truncated]
500/500 [==============================] - 172s 343ms/step - loss: 0.3229 - regression_loss: 0.3036 - classification_loss: 0.0193
1172 instances of class plum with average precision: 0.7520
mAP: 0.7520
Epoch 00035: saving model to ./training/snapshots/resnet101_pascal_35.h5
Epoch 36/150
[per-batch progress-bar output for batches 1-132 truncated]
- ETA: 2:05 - loss: 0.3379 - regression_loss: 0.3166 - classification_loss: 0.0213 133/500 [======>.......................] - ETA: 2:05 - loss: 0.3384 - regression_loss: 0.3172 - classification_loss: 0.0212 134/500 [=======>......................] - ETA: 2:04 - loss: 0.3381 - regression_loss: 0.3170 - classification_loss: 0.0211 135/500 [=======>......................] - ETA: 2:04 - loss: 0.3376 - regression_loss: 0.3165 - classification_loss: 0.0211 136/500 [=======>......................] - ETA: 2:04 - loss: 0.3374 - regression_loss: 0.3164 - classification_loss: 0.0210 137/500 [=======>......................] - ETA: 2:03 - loss: 0.3366 - regression_loss: 0.3157 - classification_loss: 0.0210 138/500 [=======>......................] - ETA: 2:03 - loss: 0.3348 - regression_loss: 0.3140 - classification_loss: 0.0209 139/500 [=======>......................] - ETA: 2:03 - loss: 0.3347 - regression_loss: 0.3138 - classification_loss: 0.0209 140/500 [=======>......................] - ETA: 2:02 - loss: 0.3360 - regression_loss: 0.3150 - classification_loss: 0.0209 141/500 [=======>......................] - ETA: 2:02 - loss: 0.3369 - regression_loss: 0.3160 - classification_loss: 0.0209 142/500 [=======>......................] - ETA: 2:02 - loss: 0.3366 - regression_loss: 0.3158 - classification_loss: 0.0208 143/500 [=======>......................] - ETA: 2:01 - loss: 0.3365 - regression_loss: 0.3157 - classification_loss: 0.0208 144/500 [=======>......................] - ETA: 2:01 - loss: 0.3355 - regression_loss: 0.3148 - classification_loss: 0.0207 145/500 [=======>......................] - ETA: 2:01 - loss: 0.3349 - regression_loss: 0.3143 - classification_loss: 0.0207 146/500 [=======>......................] - ETA: 2:00 - loss: 0.3349 - regression_loss: 0.3142 - classification_loss: 0.0206 147/500 [=======>......................] - ETA: 2:00 - loss: 0.3337 - regression_loss: 0.3132 - classification_loss: 0.0206 148/500 [=======>......................] 
- ETA: 2:00 - loss: 0.3338 - regression_loss: 0.3132 - classification_loss: 0.0206 149/500 [=======>......................] - ETA: 1:59 - loss: 0.3335 - regression_loss: 0.3130 - classification_loss: 0.0205 150/500 [========>.....................] - ETA: 1:59 - loss: 0.3334 - regression_loss: 0.3129 - classification_loss: 0.0205 151/500 [========>.....................] - ETA: 1:59 - loss: 0.3325 - regression_loss: 0.3121 - classification_loss: 0.0204 152/500 [========>.....................] - ETA: 1:58 - loss: 0.3328 - regression_loss: 0.3123 - classification_loss: 0.0205 153/500 [========>.....................] - ETA: 1:58 - loss: 0.3331 - regression_loss: 0.3126 - classification_loss: 0.0205 154/500 [========>.....................] - ETA: 1:58 - loss: 0.3336 - regression_loss: 0.3132 - classification_loss: 0.0205 155/500 [========>.....................] - ETA: 1:57 - loss: 0.3337 - regression_loss: 0.3132 - classification_loss: 0.0205 156/500 [========>.....................] - ETA: 1:57 - loss: 0.3330 - regression_loss: 0.3126 - classification_loss: 0.0204 157/500 [========>.....................] - ETA: 1:57 - loss: 0.3334 - regression_loss: 0.3130 - classification_loss: 0.0204 158/500 [========>.....................] - ETA: 1:56 - loss: 0.3331 - regression_loss: 0.3128 - classification_loss: 0.0203 159/500 [========>.....................] - ETA: 1:56 - loss: 0.3327 - regression_loss: 0.3124 - classification_loss: 0.0203 160/500 [========>.....................] - ETA: 1:56 - loss: 0.3324 - regression_loss: 0.3121 - classification_loss: 0.0202 161/500 [========>.....................] - ETA: 1:55 - loss: 0.3305 - regression_loss: 0.3103 - classification_loss: 0.0201 162/500 [========>.....................] - ETA: 1:55 - loss: 0.3301 - regression_loss: 0.3100 - classification_loss: 0.0201 163/500 [========>.....................] - ETA: 1:55 - loss: 0.3288 - regression_loss: 0.3087 - classification_loss: 0.0200 164/500 [========>.....................] 
- ETA: 1:54 - loss: 0.3287 - regression_loss: 0.3087 - classification_loss: 0.0200 165/500 [========>.....................] - ETA: 1:54 - loss: 0.3290 - regression_loss: 0.3091 - classification_loss: 0.0199 166/500 [========>.....................] - ETA: 1:54 - loss: 0.3296 - regression_loss: 0.3097 - classification_loss: 0.0199 167/500 [=========>....................] - ETA: 1:53 - loss: 0.3289 - regression_loss: 0.3091 - classification_loss: 0.0198 168/500 [=========>....................] - ETA: 1:53 - loss: 0.3282 - regression_loss: 0.3085 - classification_loss: 0.0197 169/500 [=========>....................] - ETA: 1:53 - loss: 0.3275 - regression_loss: 0.3079 - classification_loss: 0.0197 170/500 [=========>....................] - ETA: 1:52 - loss: 0.3280 - regression_loss: 0.3084 - classification_loss: 0.0196 171/500 [=========>....................] - ETA: 1:52 - loss: 0.3277 - regression_loss: 0.3081 - classification_loss: 0.0196 172/500 [=========>....................] - ETA: 1:52 - loss: 0.3284 - regression_loss: 0.3088 - classification_loss: 0.0196 173/500 [=========>....................] - ETA: 1:51 - loss: 0.3279 - regression_loss: 0.3084 - classification_loss: 0.0195 174/500 [=========>....................] - ETA: 1:51 - loss: 0.3276 - regression_loss: 0.3082 - classification_loss: 0.0195 175/500 [=========>....................] - ETA: 1:51 - loss: 0.3264 - regression_loss: 0.3070 - classification_loss: 0.0194 176/500 [=========>....................] - ETA: 1:50 - loss: 0.3274 - regression_loss: 0.3079 - classification_loss: 0.0195 177/500 [=========>....................] - ETA: 1:50 - loss: 0.3272 - regression_loss: 0.3076 - classification_loss: 0.0196 178/500 [=========>....................] - ETA: 1:50 - loss: 0.3270 - regression_loss: 0.3075 - classification_loss: 0.0195 179/500 [=========>....................] - ETA: 1:49 - loss: 0.3274 - regression_loss: 0.3079 - classification_loss: 0.0195 180/500 [=========>....................] 
- ETA: 1:49 - loss: 0.3302 - regression_loss: 0.3104 - classification_loss: 0.0197 181/500 [=========>....................] - ETA: 1:49 - loss: 0.3305 - regression_loss: 0.3108 - classification_loss: 0.0197 182/500 [=========>....................] - ETA: 1:48 - loss: 0.3298 - regression_loss: 0.3102 - classification_loss: 0.0196 183/500 [=========>....................] - ETA: 1:48 - loss: 0.3302 - regression_loss: 0.3105 - classification_loss: 0.0196 184/500 [==========>...................] - ETA: 1:48 - loss: 0.3303 - regression_loss: 0.3108 - classification_loss: 0.0196 185/500 [==========>...................] - ETA: 1:47 - loss: 0.3301 - regression_loss: 0.3105 - classification_loss: 0.0196 186/500 [==========>...................] - ETA: 1:47 - loss: 0.3298 - regression_loss: 0.3102 - classification_loss: 0.0195 187/500 [==========>...................] - ETA: 1:47 - loss: 0.3295 - regression_loss: 0.3100 - classification_loss: 0.0194 188/500 [==========>...................] - ETA: 1:46 - loss: 0.3299 - regression_loss: 0.3105 - classification_loss: 0.0194 189/500 [==========>...................] - ETA: 1:46 - loss: 0.3288 - regression_loss: 0.3094 - classification_loss: 0.0193 190/500 [==========>...................] - ETA: 1:46 - loss: 0.3311 - regression_loss: 0.3115 - classification_loss: 0.0196 191/500 [==========>...................] - ETA: 1:45 - loss: 0.3325 - regression_loss: 0.3128 - classification_loss: 0.0197 192/500 [==========>...................] - ETA: 1:45 - loss: 0.3320 - regression_loss: 0.3124 - classification_loss: 0.0196 193/500 [==========>...................] - ETA: 1:45 - loss: 0.3317 - regression_loss: 0.3120 - classification_loss: 0.0196 194/500 [==========>...................] - ETA: 1:44 - loss: 0.3319 - regression_loss: 0.3121 - classification_loss: 0.0198 195/500 [==========>...................] - ETA: 1:44 - loss: 0.3316 - regression_loss: 0.3118 - classification_loss: 0.0198 196/500 [==========>...................] 
- ETA: 1:44 - loss: 0.3319 - regression_loss: 0.3121 - classification_loss: 0.0197 197/500 [==========>...................] - ETA: 1:43 - loss: 0.3311 - regression_loss: 0.3114 - classification_loss: 0.0197 198/500 [==========>...................] - ETA: 1:43 - loss: 0.3307 - regression_loss: 0.3111 - classification_loss: 0.0196 199/500 [==========>...................] - ETA: 1:43 - loss: 0.3302 - regression_loss: 0.3106 - classification_loss: 0.0196 200/500 [===========>..................] - ETA: 1:42 - loss: 0.3304 - regression_loss: 0.3107 - classification_loss: 0.0197 201/500 [===========>..................] - ETA: 1:42 - loss: 0.3303 - regression_loss: 0.3106 - classification_loss: 0.0197 202/500 [===========>..................] - ETA: 1:42 - loss: 0.3303 - regression_loss: 0.3106 - classification_loss: 0.0197 203/500 [===========>..................] - ETA: 1:41 - loss: 0.3297 - regression_loss: 0.3101 - classification_loss: 0.0196 204/500 [===========>..................] - ETA: 1:41 - loss: 0.3286 - regression_loss: 0.3091 - classification_loss: 0.0195 205/500 [===========>..................] - ETA: 1:41 - loss: 0.3285 - regression_loss: 0.3089 - classification_loss: 0.0197 206/500 [===========>..................] - ETA: 1:40 - loss: 0.3287 - regression_loss: 0.3090 - classification_loss: 0.0197 207/500 [===========>..................] - ETA: 1:40 - loss: 0.3291 - regression_loss: 0.3092 - classification_loss: 0.0198 208/500 [===========>..................] - ETA: 1:40 - loss: 0.3284 - regression_loss: 0.3086 - classification_loss: 0.0198 209/500 [===========>..................] - ETA: 1:39 - loss: 0.3277 - regression_loss: 0.3079 - classification_loss: 0.0197 210/500 [===========>..................] - ETA: 1:39 - loss: 0.3283 - regression_loss: 0.3086 - classification_loss: 0.0197 211/500 [===========>..................] - ETA: 1:38 - loss: 0.3287 - regression_loss: 0.3089 - classification_loss: 0.0198 212/500 [===========>..................] 
- ETA: 1:38 - loss: 0.3285 - regression_loss: 0.3086 - classification_loss: 0.0198 213/500 [===========>..................] - ETA: 1:38 - loss: 0.3288 - regression_loss: 0.3090 - classification_loss: 0.0199 214/500 [===========>..................] - ETA: 1:37 - loss: 0.3287 - regression_loss: 0.3088 - classification_loss: 0.0199 215/500 [===========>..................] - ETA: 1:37 - loss: 0.3284 - regression_loss: 0.3085 - classification_loss: 0.0199 216/500 [===========>..................] - ETA: 1:37 - loss: 0.3275 - regression_loss: 0.3077 - classification_loss: 0.0198 217/500 [============>.................] - ETA: 1:37 - loss: 0.3268 - regression_loss: 0.3071 - classification_loss: 0.0198 218/500 [============>.................] - ETA: 1:36 - loss: 0.3260 - regression_loss: 0.3063 - classification_loss: 0.0197 219/500 [============>.................] - ETA: 1:36 - loss: 0.3263 - regression_loss: 0.3066 - classification_loss: 0.0197 220/500 [============>.................] - ETA: 1:35 - loss: 0.3271 - regression_loss: 0.3074 - classification_loss: 0.0197 221/500 [============>.................] - ETA: 1:35 - loss: 0.3287 - regression_loss: 0.3089 - classification_loss: 0.0198 222/500 [============>.................] - ETA: 1:35 - loss: 0.3292 - regression_loss: 0.3095 - classification_loss: 0.0197 223/500 [============>.................] - ETA: 1:34 - loss: 0.3291 - regression_loss: 0.3094 - classification_loss: 0.0197 224/500 [============>.................] - ETA: 1:34 - loss: 0.3286 - regression_loss: 0.3089 - classification_loss: 0.0197 225/500 [============>.................] - ETA: 1:34 - loss: 0.3284 - regression_loss: 0.3088 - classification_loss: 0.0196 226/500 [============>.................] - ETA: 1:33 - loss: 0.3291 - regression_loss: 0.3095 - classification_loss: 0.0197 227/500 [============>.................] - ETA: 1:33 - loss: 0.3294 - regression_loss: 0.3097 - classification_loss: 0.0197 228/500 [============>.................] 
- ETA: 1:33 - loss: 0.3287 - regression_loss: 0.3091 - classification_loss: 0.0196 229/500 [============>.................] - ETA: 1:32 - loss: 0.3279 - regression_loss: 0.3083 - classification_loss: 0.0196 230/500 [============>.................] - ETA: 1:32 - loss: 0.3281 - regression_loss: 0.3085 - classification_loss: 0.0196 231/500 [============>.................] - ETA: 1:32 - loss: 0.3278 - regression_loss: 0.3083 - classification_loss: 0.0195 232/500 [============>.................] - ETA: 1:31 - loss: 0.3275 - regression_loss: 0.3080 - classification_loss: 0.0195 233/500 [============>.................] - ETA: 1:31 - loss: 0.3279 - regression_loss: 0.3084 - classification_loss: 0.0195 234/500 [=============>................] - ETA: 1:31 - loss: 0.3286 - regression_loss: 0.3088 - classification_loss: 0.0197 235/500 [=============>................] - ETA: 1:30 - loss: 0.3282 - regression_loss: 0.3085 - classification_loss: 0.0197 236/500 [=============>................] - ETA: 1:30 - loss: 0.3281 - regression_loss: 0.3085 - classification_loss: 0.0197 237/500 [=============>................] - ETA: 1:30 - loss: 0.3275 - regression_loss: 0.3079 - classification_loss: 0.0196 238/500 [=============>................] - ETA: 1:29 - loss: 0.3266 - regression_loss: 0.3071 - classification_loss: 0.0196 239/500 [=============>................] - ETA: 1:29 - loss: 0.3278 - regression_loss: 0.3080 - classification_loss: 0.0198 240/500 [=============>................] - ETA: 1:29 - loss: 0.3274 - regression_loss: 0.3076 - classification_loss: 0.0198 241/500 [=============>................] - ETA: 1:28 - loss: 0.3270 - regression_loss: 0.3072 - classification_loss: 0.0197 242/500 [=============>................] - ETA: 1:28 - loss: 0.3272 - regression_loss: 0.3074 - classification_loss: 0.0197 243/500 [=============>................] - ETA: 1:28 - loss: 0.3271 - regression_loss: 0.3074 - classification_loss: 0.0197 244/500 [=============>................] 
- ETA: 1:27 - loss: 0.3270 - regression_loss: 0.3073 - classification_loss: 0.0197 245/500 [=============>................] - ETA: 1:27 - loss: 0.3270 - regression_loss: 0.3073 - classification_loss: 0.0197 246/500 [=============>................] - ETA: 1:27 - loss: 0.3270 - regression_loss: 0.3073 - classification_loss: 0.0197 247/500 [=============>................] - ETA: 1:26 - loss: 0.3270 - regression_loss: 0.3073 - classification_loss: 0.0197 248/500 [=============>................] - ETA: 1:26 - loss: 0.3268 - regression_loss: 0.3072 - classification_loss: 0.0196 249/500 [=============>................] - ETA: 1:26 - loss: 0.3261 - regression_loss: 0.3065 - classification_loss: 0.0196 250/500 [==============>...............] - ETA: 1:25 - loss: 0.3257 - regression_loss: 0.3061 - classification_loss: 0.0196 251/500 [==============>...............] - ETA: 1:25 - loss: 0.3259 - regression_loss: 0.3063 - classification_loss: 0.0196 252/500 [==============>...............] - ETA: 1:25 - loss: 0.3255 - regression_loss: 0.3059 - classification_loss: 0.0195 253/500 [==============>...............] - ETA: 1:24 - loss: 0.3253 - regression_loss: 0.3058 - classification_loss: 0.0195 254/500 [==============>...............] - ETA: 1:24 - loss: 0.3252 - regression_loss: 0.3057 - classification_loss: 0.0195 255/500 [==============>...............] - ETA: 1:24 - loss: 0.3254 - regression_loss: 0.3058 - classification_loss: 0.0196 256/500 [==============>...............] - ETA: 1:23 - loss: 0.3255 - regression_loss: 0.3059 - classification_loss: 0.0196 257/500 [==============>...............] - ETA: 1:23 - loss: 0.3258 - regression_loss: 0.3062 - classification_loss: 0.0196 258/500 [==============>...............] - ETA: 1:23 - loss: 0.3256 - regression_loss: 0.3060 - classification_loss: 0.0196 259/500 [==============>...............] - ETA: 1:22 - loss: 0.3268 - regression_loss: 0.3070 - classification_loss: 0.0198 260/500 [==============>...............] 
- ETA: 1:22 - loss: 0.3268 - regression_loss: 0.3070 - classification_loss: 0.0198 261/500 [==============>...............] - ETA: 1:21 - loss: 0.3267 - regression_loss: 0.3069 - classification_loss: 0.0198 262/500 [==============>...............] - ETA: 1:21 - loss: 0.3261 - regression_loss: 0.3064 - classification_loss: 0.0197 263/500 [==============>...............] - ETA: 1:21 - loss: 0.3256 - regression_loss: 0.3059 - classification_loss: 0.0197 264/500 [==============>...............] - ETA: 1:20 - loss: 0.3272 - regression_loss: 0.3076 - classification_loss: 0.0197 265/500 [==============>...............] - ETA: 1:20 - loss: 0.3267 - regression_loss: 0.3071 - classification_loss: 0.0196 266/500 [==============>...............] - ETA: 1:20 - loss: 0.3274 - regression_loss: 0.3077 - classification_loss: 0.0197 267/500 [===============>..............] - ETA: 1:19 - loss: 0.3269 - regression_loss: 0.3073 - classification_loss: 0.0196 268/500 [===============>..............] - ETA: 1:19 - loss: 0.3264 - regression_loss: 0.3067 - classification_loss: 0.0196 269/500 [===============>..............] - ETA: 1:19 - loss: 0.3256 - regression_loss: 0.3060 - classification_loss: 0.0196 270/500 [===============>..............] - ETA: 1:18 - loss: 0.3247 - regression_loss: 0.3051 - classification_loss: 0.0195 271/500 [===============>..............] - ETA: 1:18 - loss: 0.3244 - regression_loss: 0.3049 - classification_loss: 0.0195 272/500 [===============>..............] - ETA: 1:18 - loss: 0.3245 - regression_loss: 0.3050 - classification_loss: 0.0196 273/500 [===============>..............] - ETA: 1:17 - loss: 0.3248 - regression_loss: 0.3052 - classification_loss: 0.0196 274/500 [===============>..............] - ETA: 1:17 - loss: 0.3242 - regression_loss: 0.3047 - classification_loss: 0.0195 275/500 [===============>..............] - ETA: 1:17 - loss: 0.3238 - regression_loss: 0.3043 - classification_loss: 0.0195 276/500 [===============>..............] 
- ETA: 1:16 - loss: 0.3244 - regression_loss: 0.3049 - classification_loss: 0.0195 277/500 [===============>..............] - ETA: 1:16 - loss: 0.3240 - regression_loss: 0.3046 - classification_loss: 0.0195 278/500 [===============>..............] - ETA: 1:16 - loss: 0.3243 - regression_loss: 0.3048 - classification_loss: 0.0195 279/500 [===============>..............] - ETA: 1:15 - loss: 0.3240 - regression_loss: 0.3045 - classification_loss: 0.0195 280/500 [===============>..............] - ETA: 1:15 - loss: 0.3239 - regression_loss: 0.3044 - classification_loss: 0.0195 281/500 [===============>..............] - ETA: 1:15 - loss: 0.3237 - regression_loss: 0.3042 - classification_loss: 0.0195 282/500 [===============>..............] - ETA: 1:14 - loss: 0.3235 - regression_loss: 0.3040 - classification_loss: 0.0195 283/500 [===============>..............] - ETA: 1:14 - loss: 0.3232 - regression_loss: 0.3037 - classification_loss: 0.0194 284/500 [================>.............] - ETA: 1:14 - loss: 0.3232 - regression_loss: 0.3038 - classification_loss: 0.0194 285/500 [================>.............] - ETA: 1:13 - loss: 0.3235 - regression_loss: 0.3040 - classification_loss: 0.0194 286/500 [================>.............] - ETA: 1:13 - loss: 0.3230 - regression_loss: 0.3036 - classification_loss: 0.0194 287/500 [================>.............] - ETA: 1:13 - loss: 0.3229 - regression_loss: 0.3035 - classification_loss: 0.0194 288/500 [================>.............] - ETA: 1:12 - loss: 0.3221 - regression_loss: 0.3028 - classification_loss: 0.0193 289/500 [================>.............] - ETA: 1:12 - loss: 0.3225 - regression_loss: 0.3031 - classification_loss: 0.0194 290/500 [================>.............] - ETA: 1:12 - loss: 0.3220 - regression_loss: 0.3027 - classification_loss: 0.0194 291/500 [================>.............] - ETA: 1:11 - loss: 0.3219 - regression_loss: 0.3025 - classification_loss: 0.0193 292/500 [================>.............] 
- ETA: 1:11 - loss: 0.3232 - regression_loss: 0.3039 - classification_loss: 0.0193 293/500 [================>.............] - ETA: 1:10 - loss: 0.3234 - regression_loss: 0.3041 - classification_loss: 0.0194 294/500 [================>.............] - ETA: 1:10 - loss: 0.3238 - regression_loss: 0.3043 - classification_loss: 0.0195 295/500 [================>.............] - ETA: 1:10 - loss: 0.3235 - regression_loss: 0.3041 - classification_loss: 0.0194 296/500 [================>.............] - ETA: 1:09 - loss: 0.3240 - regression_loss: 0.3044 - classification_loss: 0.0196 297/500 [================>.............] - ETA: 1:09 - loss: 0.3239 - regression_loss: 0.3043 - classification_loss: 0.0196 298/500 [================>.............] - ETA: 1:09 - loss: 0.3231 - regression_loss: 0.3036 - classification_loss: 0.0195 299/500 [================>.............] - ETA: 1:08 - loss: 0.3240 - regression_loss: 0.3044 - classification_loss: 0.0196 300/500 [=================>............] - ETA: 1:08 - loss: 0.3235 - regression_loss: 0.3040 - classification_loss: 0.0195 301/500 [=================>............] - ETA: 1:08 - loss: 0.3240 - regression_loss: 0.3044 - classification_loss: 0.0195 302/500 [=================>............] - ETA: 1:07 - loss: 0.3243 - regression_loss: 0.3047 - classification_loss: 0.0195 303/500 [=================>............] - ETA: 1:07 - loss: 0.3236 - regression_loss: 0.3041 - classification_loss: 0.0195 304/500 [=================>............] - ETA: 1:07 - loss: 0.3231 - regression_loss: 0.3036 - classification_loss: 0.0195 305/500 [=================>............] - ETA: 1:06 - loss: 0.3235 - regression_loss: 0.3040 - classification_loss: 0.0195 306/500 [=================>............] - ETA: 1:06 - loss: 0.3232 - regression_loss: 0.3037 - classification_loss: 0.0195 307/500 [=================>............] - ETA: 1:06 - loss: 0.3226 - regression_loss: 0.3032 - classification_loss: 0.0194 308/500 [=================>............] 
- ETA: 1:05 - loss: 0.3230 - regression_loss: 0.3036 - classification_loss: 0.0194 309/500 [=================>............] - ETA: 1:05 - loss: 0.3229 - regression_loss: 0.3035 - classification_loss: 0.0194 310/500 [=================>............] - ETA: 1:05 - loss: 0.3227 - regression_loss: 0.3034 - classification_loss: 0.0194 311/500 [=================>............] - ETA: 1:04 - loss: 0.3222 - regression_loss: 0.3029 - classification_loss: 0.0193 312/500 [=================>............] - ETA: 1:04 - loss: 0.3220 - regression_loss: 0.3027 - classification_loss: 0.0193 313/500 [=================>............] - ETA: 1:04 - loss: 0.3222 - regression_loss: 0.3029 - classification_loss: 0.0193 314/500 [=================>............] - ETA: 1:03 - loss: 0.3225 - regression_loss: 0.3033 - classification_loss: 0.0193 315/500 [=================>............] - ETA: 1:03 - loss: 0.3232 - regression_loss: 0.3039 - classification_loss: 0.0193 316/500 [=================>............] - ETA: 1:03 - loss: 0.3235 - regression_loss: 0.3042 - classification_loss: 0.0193 317/500 [==================>...........] - ETA: 1:02 - loss: 0.3235 - regression_loss: 0.3042 - classification_loss: 0.0193 318/500 [==================>...........] - ETA: 1:02 - loss: 0.3237 - regression_loss: 0.3044 - classification_loss: 0.0193 319/500 [==================>...........] - ETA: 1:02 - loss: 0.3249 - regression_loss: 0.3056 - classification_loss: 0.0193 320/500 [==================>...........] - ETA: 1:01 - loss: 0.3249 - regression_loss: 0.3055 - classification_loss: 0.0193 321/500 [==================>...........] - ETA: 1:01 - loss: 0.3261 - regression_loss: 0.3067 - classification_loss: 0.0194 322/500 [==================>...........] - ETA: 1:01 - loss: 0.3265 - regression_loss: 0.3071 - classification_loss: 0.0194 323/500 [==================>...........] - ETA: 1:00 - loss: 0.3265 - regression_loss: 0.3072 - classification_loss: 0.0194 324/500 [==================>...........] 
- ETA: 1:00 - loss: 0.3262 - regression_loss: 0.3069 - classification_loss: 0.0193 325/500 [==================>...........] - ETA: 1:00 - loss: 0.3263 - regression_loss: 0.3069 - classification_loss: 0.0193 326/500 [==================>...........] - ETA: 59s - loss: 0.3258 - regression_loss: 0.3065 - classification_loss: 0.0193  327/500 [==================>...........] - ETA: 59s - loss: 0.3263 - regression_loss: 0.3069 - classification_loss: 0.0194 328/500 [==================>...........] - ETA: 59s - loss: 0.3261 - regression_loss: 0.3067 - classification_loss: 0.0193 329/500 [==================>...........] - ETA: 58s - loss: 0.3256 - regression_loss: 0.3064 - classification_loss: 0.0193 330/500 [==================>...........] - ETA: 58s - loss: 0.3252 - regression_loss: 0.3059 - classification_loss: 0.0192 331/500 [==================>...........] - ETA: 58s - loss: 0.3262 - regression_loss: 0.3068 - classification_loss: 0.0194 332/500 [==================>...........] - ETA: 57s - loss: 0.3262 - regression_loss: 0.3068 - classification_loss: 0.0194 333/500 [==================>...........] - ETA: 57s - loss: 0.3266 - regression_loss: 0.3072 - classification_loss: 0.0194 334/500 [===================>..........] - ETA: 56s - loss: 0.3273 - regression_loss: 0.3079 - classification_loss: 0.0194 335/500 [===================>..........] - ETA: 56s - loss: 0.3273 - regression_loss: 0.3079 - classification_loss: 0.0194 336/500 [===================>..........] - ETA: 56s - loss: 0.3276 - regression_loss: 0.3082 - classification_loss: 0.0194 337/500 [===================>..........] - ETA: 55s - loss: 0.3273 - regression_loss: 0.3079 - classification_loss: 0.0194 338/500 [===================>..........] - ETA: 55s - loss: 0.3269 - regression_loss: 0.3076 - classification_loss: 0.0194 339/500 [===================>..........] - ETA: 55s - loss: 0.3273 - regression_loss: 0.3079 - classification_loss: 0.0195 340/500 [===================>..........] 
- ETA: 54s - loss: 0.3270 - regression_loss: 0.3076 - classification_loss: 0.0195
[per-batch progress for steps 342-499 of epoch 36 elided; running loss decreased gradually from ~0.327 to ~0.319, with classification_loss steady near 0.019]
500/500 [==============================] - 172s 344ms/step - loss: 0.3184 - regression_loss: 0.3000 - classification_loss: 0.0184
1172 instances of class plum with average precision: 0.7402
mAP: 0.7402
Epoch 00036: saving model to ./training/snapshots/resnet101_pascal_36.h5
Epoch 37/150
[per-batch progress for steps 1-173 of epoch 37 elided; running loss climbed from 0.05 on the first batch and settled near 0.32, ETA decreasing from ~2:45 to ~1:52] 174/500 [=========>....................]
- ETA: 1:57 - loss: 0.3195 - regression_loss: 0.3002 - classification_loss: 0.0193 159/500 [========>.....................] - ETA: 1:56 - loss: 0.3194 - regression_loss: 0.3001 - classification_loss: 0.0193 160/500 [========>.....................] - ETA: 1:56 - loss: 0.3195 - regression_loss: 0.3002 - classification_loss: 0.0193 161/500 [========>.....................] - ETA: 1:56 - loss: 0.3192 - regression_loss: 0.2998 - classification_loss: 0.0194 162/500 [========>.....................] - ETA: 1:56 - loss: 0.3181 - regression_loss: 0.2988 - classification_loss: 0.0193 163/500 [========>.....................] - ETA: 1:55 - loss: 0.3187 - regression_loss: 0.2994 - classification_loss: 0.0193 164/500 [========>.....................] - ETA: 1:55 - loss: 0.3218 - regression_loss: 0.3023 - classification_loss: 0.0195 165/500 [========>.....................] - ETA: 1:55 - loss: 0.3212 - regression_loss: 0.3018 - classification_loss: 0.0195 166/500 [========>.....................] - ETA: 1:54 - loss: 0.3201 - regression_loss: 0.3007 - classification_loss: 0.0194 167/500 [=========>....................] - ETA: 1:54 - loss: 0.3196 - regression_loss: 0.3003 - classification_loss: 0.0193 168/500 [=========>....................] - ETA: 1:54 - loss: 0.3210 - regression_loss: 0.3016 - classification_loss: 0.0193 169/500 [=========>....................] - ETA: 1:53 - loss: 0.3199 - regression_loss: 0.3006 - classification_loss: 0.0193 170/500 [=========>....................] - ETA: 1:53 - loss: 0.3203 - regression_loss: 0.3010 - classification_loss: 0.0193 171/500 [=========>....................] - ETA: 1:53 - loss: 0.3204 - regression_loss: 0.3011 - classification_loss: 0.0193 172/500 [=========>....................] - ETA: 1:52 - loss: 0.3205 - regression_loss: 0.3013 - classification_loss: 0.0192 173/500 [=========>....................] - ETA: 1:52 - loss: 0.3208 - regression_loss: 0.3016 - classification_loss: 0.0192 174/500 [=========>....................] 
- ETA: 1:52 - loss: 0.3199 - regression_loss: 0.3007 - classification_loss: 0.0192 175/500 [=========>....................] - ETA: 1:51 - loss: 0.3199 - regression_loss: 0.3008 - classification_loss: 0.0191 176/500 [=========>....................] - ETA: 1:51 - loss: 0.3184 - regression_loss: 0.2993 - classification_loss: 0.0191 177/500 [=========>....................] - ETA: 1:50 - loss: 0.3198 - regression_loss: 0.3005 - classification_loss: 0.0193 178/500 [=========>....................] - ETA: 1:50 - loss: 0.3202 - regression_loss: 0.3008 - classification_loss: 0.0193 179/500 [=========>....................] - ETA: 1:50 - loss: 0.3196 - regression_loss: 0.3003 - classification_loss: 0.0193 180/500 [=========>....................] - ETA: 1:49 - loss: 0.3209 - regression_loss: 0.3017 - classification_loss: 0.0193 181/500 [=========>....................] - ETA: 1:49 - loss: 0.3218 - regression_loss: 0.3024 - classification_loss: 0.0194 182/500 [=========>....................] - ETA: 1:49 - loss: 0.3210 - regression_loss: 0.3016 - classification_loss: 0.0193 183/500 [=========>....................] - ETA: 1:48 - loss: 0.3200 - regression_loss: 0.3007 - classification_loss: 0.0192 184/500 [==========>...................] - ETA: 1:48 - loss: 0.3204 - regression_loss: 0.3010 - classification_loss: 0.0193 185/500 [==========>...................] - ETA: 1:48 - loss: 0.3202 - regression_loss: 0.3008 - classification_loss: 0.0193 186/500 [==========>...................] - ETA: 1:47 - loss: 0.3204 - regression_loss: 0.3011 - classification_loss: 0.0193 187/500 [==========>...................] - ETA: 1:47 - loss: 0.3195 - regression_loss: 0.3003 - classification_loss: 0.0192 188/500 [==========>...................] - ETA: 1:47 - loss: 0.3190 - regression_loss: 0.2998 - classification_loss: 0.0192 189/500 [==========>...................] - ETA: 1:46 - loss: 0.3187 - regression_loss: 0.2996 - classification_loss: 0.0191 190/500 [==========>...................] 
- ETA: 1:46 - loss: 0.3196 - regression_loss: 0.3003 - classification_loss: 0.0192 191/500 [==========>...................] - ETA: 1:46 - loss: 0.3188 - regression_loss: 0.2996 - classification_loss: 0.0192 192/500 [==========>...................] - ETA: 1:45 - loss: 0.3186 - regression_loss: 0.2994 - classification_loss: 0.0192 193/500 [==========>...................] - ETA: 1:45 - loss: 0.3182 - regression_loss: 0.2991 - classification_loss: 0.0192 194/500 [==========>...................] - ETA: 1:45 - loss: 0.3178 - regression_loss: 0.2986 - classification_loss: 0.0192 195/500 [==========>...................] - ETA: 1:44 - loss: 0.3182 - regression_loss: 0.2990 - classification_loss: 0.0192 196/500 [==========>...................] - ETA: 1:44 - loss: 0.3178 - regression_loss: 0.2986 - classification_loss: 0.0192 197/500 [==========>...................] - ETA: 1:43 - loss: 0.3177 - regression_loss: 0.2985 - classification_loss: 0.0192 198/500 [==========>...................] - ETA: 1:43 - loss: 0.3175 - regression_loss: 0.2983 - classification_loss: 0.0192 199/500 [==========>...................] - ETA: 1:43 - loss: 0.3182 - regression_loss: 0.2990 - classification_loss: 0.0192 200/500 [===========>..................] - ETA: 1:43 - loss: 0.3170 - regression_loss: 0.2979 - classification_loss: 0.0191 201/500 [===========>..................] - ETA: 1:42 - loss: 0.3163 - regression_loss: 0.2972 - classification_loss: 0.0190 202/500 [===========>..................] - ETA: 1:42 - loss: 0.3158 - regression_loss: 0.2968 - classification_loss: 0.0190 203/500 [===========>..................] - ETA: 1:42 - loss: 0.3148 - regression_loss: 0.2959 - classification_loss: 0.0190 204/500 [===========>..................] - ETA: 1:41 - loss: 0.3144 - regression_loss: 0.2954 - classification_loss: 0.0190 205/500 [===========>..................] - ETA: 1:41 - loss: 0.3147 - regression_loss: 0.2957 - classification_loss: 0.0191 206/500 [===========>..................] 
- ETA: 1:41 - loss: 0.3151 - regression_loss: 0.2961 - classification_loss: 0.0190 207/500 [===========>..................] - ETA: 1:40 - loss: 0.3153 - regression_loss: 0.2963 - classification_loss: 0.0190 208/500 [===========>..................] - ETA: 1:40 - loss: 0.3149 - regression_loss: 0.2960 - classification_loss: 0.0189 209/500 [===========>..................] - ETA: 1:40 - loss: 0.3152 - regression_loss: 0.2962 - classification_loss: 0.0190 210/500 [===========>..................] - ETA: 1:39 - loss: 0.3153 - regression_loss: 0.2963 - classification_loss: 0.0190 211/500 [===========>..................] - ETA: 1:39 - loss: 0.3152 - regression_loss: 0.2962 - classification_loss: 0.0190 212/500 [===========>..................] - ETA: 1:39 - loss: 0.3150 - regression_loss: 0.2961 - classification_loss: 0.0189 213/500 [===========>..................] - ETA: 1:38 - loss: 0.3158 - regression_loss: 0.2969 - classification_loss: 0.0189 214/500 [===========>..................] - ETA: 1:38 - loss: 0.3160 - regression_loss: 0.2971 - classification_loss: 0.0189 215/500 [===========>..................] - ETA: 1:37 - loss: 0.3152 - regression_loss: 0.2964 - classification_loss: 0.0189 216/500 [===========>..................] - ETA: 1:37 - loss: 0.3150 - regression_loss: 0.2963 - classification_loss: 0.0188 217/500 [============>.................] - ETA: 1:37 - loss: 0.3141 - regression_loss: 0.2954 - classification_loss: 0.0187 218/500 [============>.................] - ETA: 1:36 - loss: 0.3133 - regression_loss: 0.2947 - classification_loss: 0.0187 219/500 [============>.................] - ETA: 1:36 - loss: 0.3143 - regression_loss: 0.2955 - classification_loss: 0.0188 220/500 [============>.................] - ETA: 1:36 - loss: 0.3138 - regression_loss: 0.2951 - classification_loss: 0.0188 221/500 [============>.................] - ETA: 1:35 - loss: 0.3143 - regression_loss: 0.2954 - classification_loss: 0.0188 222/500 [============>.................] 
- ETA: 1:35 - loss: 0.3155 - regression_loss: 0.2966 - classification_loss: 0.0188 223/500 [============>.................] - ETA: 1:35 - loss: 0.3153 - regression_loss: 0.2965 - classification_loss: 0.0188 224/500 [============>.................] - ETA: 1:34 - loss: 0.3153 - regression_loss: 0.2965 - classification_loss: 0.0189 225/500 [============>.................] - ETA: 1:34 - loss: 0.3147 - regression_loss: 0.2959 - classification_loss: 0.0188 226/500 [============>.................] - ETA: 1:34 - loss: 0.3145 - regression_loss: 0.2957 - classification_loss: 0.0188 227/500 [============>.................] - ETA: 1:33 - loss: 0.3138 - regression_loss: 0.2951 - classification_loss: 0.0187 228/500 [============>.................] - ETA: 1:33 - loss: 0.3140 - regression_loss: 0.2953 - classification_loss: 0.0187 229/500 [============>.................] - ETA: 1:33 - loss: 0.3139 - regression_loss: 0.2952 - classification_loss: 0.0187 230/500 [============>.................] - ETA: 1:32 - loss: 0.3139 - regression_loss: 0.2952 - classification_loss: 0.0187 231/500 [============>.................] - ETA: 1:32 - loss: 0.3137 - regression_loss: 0.2950 - classification_loss: 0.0186 232/500 [============>.................] - ETA: 1:32 - loss: 0.3133 - regression_loss: 0.2947 - classification_loss: 0.0186 233/500 [============>.................] - ETA: 1:31 - loss: 0.3128 - regression_loss: 0.2942 - classification_loss: 0.0186 234/500 [=============>................] - ETA: 1:31 - loss: 0.3125 - regression_loss: 0.2940 - classification_loss: 0.0186 235/500 [=============>................] - ETA: 1:31 - loss: 0.3128 - regression_loss: 0.2942 - classification_loss: 0.0186 236/500 [=============>................] - ETA: 1:30 - loss: 0.3130 - regression_loss: 0.2944 - classification_loss: 0.0186 237/500 [=============>................] - ETA: 1:30 - loss: 0.3132 - regression_loss: 0.2946 - classification_loss: 0.0186 238/500 [=============>................] 
- ETA: 1:30 - loss: 0.3136 - regression_loss: 0.2949 - classification_loss: 0.0187 239/500 [=============>................] - ETA: 1:29 - loss: 0.3134 - regression_loss: 0.2948 - classification_loss: 0.0187 240/500 [=============>................] - ETA: 1:29 - loss: 0.3133 - regression_loss: 0.2947 - classification_loss: 0.0187 241/500 [=============>................] - ETA: 1:29 - loss: 0.3140 - regression_loss: 0.2951 - classification_loss: 0.0189 242/500 [=============>................] - ETA: 1:28 - loss: 0.3140 - regression_loss: 0.2951 - classification_loss: 0.0189 243/500 [=============>................] - ETA: 1:28 - loss: 0.3137 - regression_loss: 0.2948 - classification_loss: 0.0189 244/500 [=============>................] - ETA: 1:28 - loss: 0.3134 - regression_loss: 0.2946 - classification_loss: 0.0188 245/500 [=============>................] - ETA: 1:27 - loss: 0.3130 - regression_loss: 0.2942 - classification_loss: 0.0188 246/500 [=============>................] - ETA: 1:27 - loss: 0.3129 - regression_loss: 0.2941 - classification_loss: 0.0188 247/500 [=============>................] - ETA: 1:27 - loss: 0.3125 - regression_loss: 0.2937 - classification_loss: 0.0188 248/500 [=============>................] - ETA: 1:26 - loss: 0.3126 - regression_loss: 0.2938 - classification_loss: 0.0188 249/500 [=============>................] - ETA: 1:26 - loss: 0.3126 - regression_loss: 0.2938 - classification_loss: 0.0188 250/500 [==============>...............] - ETA: 1:26 - loss: 0.3118 - regression_loss: 0.2930 - classification_loss: 0.0188 251/500 [==============>...............] - ETA: 1:25 - loss: 0.3111 - regression_loss: 0.2924 - classification_loss: 0.0187 252/500 [==============>...............] - ETA: 1:25 - loss: 0.3108 - regression_loss: 0.2921 - classification_loss: 0.0187 253/500 [==============>...............] - ETA: 1:24 - loss: 0.3109 - regression_loss: 0.2922 - classification_loss: 0.0187 254/500 [==============>...............] 
- ETA: 1:24 - loss: 0.3106 - regression_loss: 0.2920 - classification_loss: 0.0186 255/500 [==============>...............] - ETA: 1:24 - loss: 0.3100 - regression_loss: 0.2914 - classification_loss: 0.0186 256/500 [==============>...............] - ETA: 1:23 - loss: 0.3098 - regression_loss: 0.2912 - classification_loss: 0.0186 257/500 [==============>...............] - ETA: 1:23 - loss: 0.3090 - regression_loss: 0.2905 - classification_loss: 0.0185 258/500 [==============>...............] - ETA: 1:23 - loss: 0.3085 - regression_loss: 0.2900 - classification_loss: 0.0184 259/500 [==============>...............] - ETA: 1:22 - loss: 0.3080 - regression_loss: 0.2896 - classification_loss: 0.0184 260/500 [==============>...............] - ETA: 1:22 - loss: 0.3081 - regression_loss: 0.2898 - classification_loss: 0.0184 261/500 [==============>...............] - ETA: 1:22 - loss: 0.3092 - regression_loss: 0.2909 - classification_loss: 0.0183 262/500 [==============>...............] - ETA: 1:21 - loss: 0.3091 - regression_loss: 0.2908 - classification_loss: 0.0183 263/500 [==============>...............] - ETA: 1:21 - loss: 0.3088 - regression_loss: 0.2905 - classification_loss: 0.0183 264/500 [==============>...............] - ETA: 1:21 - loss: 0.3090 - regression_loss: 0.2906 - classification_loss: 0.0183 265/500 [==============>...............] - ETA: 1:20 - loss: 0.3087 - regression_loss: 0.2904 - classification_loss: 0.0183 266/500 [==============>...............] - ETA: 1:20 - loss: 0.3097 - regression_loss: 0.2913 - classification_loss: 0.0184 267/500 [===============>..............] - ETA: 1:20 - loss: 0.3093 - regression_loss: 0.2909 - classification_loss: 0.0183 268/500 [===============>..............] - ETA: 1:19 - loss: 0.3093 - regression_loss: 0.2909 - classification_loss: 0.0184 269/500 [===============>..............] - ETA: 1:19 - loss: 0.3092 - regression_loss: 0.2909 - classification_loss: 0.0184 270/500 [===============>..............] 
- ETA: 1:19 - loss: 0.3086 - regression_loss: 0.2903 - classification_loss: 0.0183 271/500 [===============>..............] - ETA: 1:18 - loss: 0.3087 - regression_loss: 0.2903 - classification_loss: 0.0184 272/500 [===============>..............] - ETA: 1:18 - loss: 0.3093 - regression_loss: 0.2909 - classification_loss: 0.0184 273/500 [===============>..............] - ETA: 1:18 - loss: 0.3086 - regression_loss: 0.2902 - classification_loss: 0.0184 274/500 [===============>..............] - ETA: 1:17 - loss: 0.3085 - regression_loss: 0.2902 - classification_loss: 0.0183 275/500 [===============>..............] - ETA: 1:17 - loss: 0.3083 - regression_loss: 0.2901 - classification_loss: 0.0183 276/500 [===============>..............] - ETA: 1:17 - loss: 0.3084 - regression_loss: 0.2901 - classification_loss: 0.0183 277/500 [===============>..............] - ETA: 1:16 - loss: 0.3082 - regression_loss: 0.2899 - classification_loss: 0.0183 278/500 [===============>..............] - ETA: 1:16 - loss: 0.3082 - regression_loss: 0.2899 - classification_loss: 0.0183 279/500 [===============>..............] - ETA: 1:16 - loss: 0.3080 - regression_loss: 0.2898 - classification_loss: 0.0183 280/500 [===============>..............] - ETA: 1:15 - loss: 0.3081 - regression_loss: 0.2899 - classification_loss: 0.0182 281/500 [===============>..............] - ETA: 1:15 - loss: 0.3078 - regression_loss: 0.2896 - classification_loss: 0.0182 282/500 [===============>..............] - ETA: 1:14 - loss: 0.3084 - regression_loss: 0.2902 - classification_loss: 0.0182 283/500 [===============>..............] - ETA: 1:14 - loss: 0.3086 - regression_loss: 0.2904 - classification_loss: 0.0182 284/500 [================>.............] - ETA: 1:14 - loss: 0.3083 - regression_loss: 0.2901 - classification_loss: 0.0182 285/500 [================>.............] - ETA: 1:13 - loss: 0.3076 - regression_loss: 0.2895 - classification_loss: 0.0181 286/500 [================>.............] 
- ETA: 1:13 - loss: 0.3076 - regression_loss: 0.2895 - classification_loss: 0.0181 287/500 [================>.............] - ETA: 1:13 - loss: 0.3083 - regression_loss: 0.2902 - classification_loss: 0.0181 288/500 [================>.............] - ETA: 1:12 - loss: 0.3077 - regression_loss: 0.2897 - classification_loss: 0.0181 289/500 [================>.............] - ETA: 1:12 - loss: 0.3079 - regression_loss: 0.2898 - classification_loss: 0.0181 290/500 [================>.............] - ETA: 1:12 - loss: 0.3073 - regression_loss: 0.2893 - classification_loss: 0.0181 291/500 [================>.............] - ETA: 1:11 - loss: 0.3071 - regression_loss: 0.2891 - classification_loss: 0.0180 292/500 [================>.............] - ETA: 1:11 - loss: 0.3072 - regression_loss: 0.2892 - classification_loss: 0.0180 293/500 [================>.............] - ETA: 1:11 - loss: 0.3069 - regression_loss: 0.2889 - classification_loss: 0.0180 294/500 [================>.............] - ETA: 1:10 - loss: 0.3074 - regression_loss: 0.2893 - classification_loss: 0.0181 295/500 [================>.............] - ETA: 1:10 - loss: 0.3074 - regression_loss: 0.2893 - classification_loss: 0.0180 296/500 [================>.............] - ETA: 1:10 - loss: 0.3074 - regression_loss: 0.2893 - classification_loss: 0.0181 297/500 [================>.............] - ETA: 1:09 - loss: 0.3069 - regression_loss: 0.2889 - classification_loss: 0.0180 298/500 [================>.............] - ETA: 1:09 - loss: 0.3071 - regression_loss: 0.2890 - classification_loss: 0.0181 299/500 [================>.............] - ETA: 1:09 - loss: 0.3070 - regression_loss: 0.2890 - classification_loss: 0.0180 300/500 [=================>............] - ETA: 1:08 - loss: 0.3076 - regression_loss: 0.2895 - classification_loss: 0.0181 301/500 [=================>............] - ETA: 1:08 - loss: 0.3070 - regression_loss: 0.2889 - classification_loss: 0.0181 302/500 [=================>............] 
- ETA: 1:08 - loss: 0.3071 - regression_loss: 0.2890 - classification_loss: 0.0181 303/500 [=================>............] - ETA: 1:07 - loss: 0.3070 - regression_loss: 0.2890 - classification_loss: 0.0181 304/500 [=================>............] - ETA: 1:07 - loss: 0.3076 - regression_loss: 0.2895 - classification_loss: 0.0181 305/500 [=================>............] - ETA: 1:07 - loss: 0.3071 - regression_loss: 0.2891 - classification_loss: 0.0180 306/500 [=================>............] - ETA: 1:06 - loss: 0.3069 - regression_loss: 0.2889 - classification_loss: 0.0180 307/500 [=================>............] - ETA: 1:06 - loss: 0.3067 - regression_loss: 0.2887 - classification_loss: 0.0180 308/500 [=================>............] - ETA: 1:06 - loss: 0.3065 - regression_loss: 0.2885 - classification_loss: 0.0180 309/500 [=================>............] - ETA: 1:05 - loss: 0.3059 - regression_loss: 0.2880 - classification_loss: 0.0179 310/500 [=================>............] - ETA: 1:05 - loss: 0.3054 - regression_loss: 0.2875 - classification_loss: 0.0179 311/500 [=================>............] - ETA: 1:04 - loss: 0.3058 - regression_loss: 0.2879 - classification_loss: 0.0179 312/500 [=================>............] - ETA: 1:04 - loss: 0.3061 - regression_loss: 0.2882 - classification_loss: 0.0179 313/500 [=================>............] - ETA: 1:04 - loss: 0.3066 - regression_loss: 0.2887 - classification_loss: 0.0179 314/500 [=================>............] - ETA: 1:03 - loss: 0.3064 - regression_loss: 0.2885 - classification_loss: 0.0179 315/500 [=================>............] - ETA: 1:03 - loss: 0.3065 - regression_loss: 0.2886 - classification_loss: 0.0179 316/500 [=================>............] - ETA: 1:03 - loss: 0.3062 - regression_loss: 0.2883 - classification_loss: 0.0179 317/500 [==================>...........] - ETA: 1:02 - loss: 0.3067 - regression_loss: 0.2889 - classification_loss: 0.0179 318/500 [==================>...........] 
- ETA: 1:02 - loss: 0.3069 - regression_loss: 0.2891 - classification_loss: 0.0179 319/500 [==================>...........] - ETA: 1:02 - loss: 0.3078 - regression_loss: 0.2899 - classification_loss: 0.0179 320/500 [==================>...........] - ETA: 1:01 - loss: 0.3079 - regression_loss: 0.2899 - classification_loss: 0.0180 321/500 [==================>...........] - ETA: 1:01 - loss: 0.3081 - regression_loss: 0.2902 - classification_loss: 0.0179 322/500 [==================>...........] - ETA: 1:01 - loss: 0.3079 - regression_loss: 0.2899 - classification_loss: 0.0179 323/500 [==================>...........] - ETA: 1:00 - loss: 0.3073 - regression_loss: 0.2895 - classification_loss: 0.0179 324/500 [==================>...........] - ETA: 1:00 - loss: 0.3072 - regression_loss: 0.2893 - classification_loss: 0.0179 325/500 [==================>...........] - ETA: 1:00 - loss: 0.3077 - regression_loss: 0.2899 - classification_loss: 0.0179 326/500 [==================>...........] - ETA: 59s - loss: 0.3077 - regression_loss: 0.2899 - classification_loss: 0.0178  327/500 [==================>...........] - ETA: 59s - loss: 0.3081 - regression_loss: 0.2902 - classification_loss: 0.0179 328/500 [==================>...........] - ETA: 59s - loss: 0.3076 - regression_loss: 0.2897 - classification_loss: 0.0179 329/500 [==================>...........] - ETA: 58s - loss: 0.3073 - regression_loss: 0.2894 - classification_loss: 0.0178 330/500 [==================>...........] - ETA: 58s - loss: 0.3075 - regression_loss: 0.2896 - classification_loss: 0.0178 331/500 [==================>...........] - ETA: 58s - loss: 0.3080 - regression_loss: 0.2900 - classification_loss: 0.0179 332/500 [==================>...........] - ETA: 57s - loss: 0.3090 - regression_loss: 0.2911 - classification_loss: 0.0179 333/500 [==================>...........] - ETA: 57s - loss: 0.3095 - regression_loss: 0.2916 - classification_loss: 0.0179 334/500 [===================>..........] 
- ETA: 57s - loss: 0.3095 - regression_loss: 0.2916 - classification_loss: 0.0179 335/500 [===================>..........] - ETA: 56s - loss: 0.3095 - regression_loss: 0.2916 - classification_loss: 0.0179 336/500 [===================>..........] - ETA: 56s - loss: 0.3097 - regression_loss: 0.2918 - classification_loss: 0.0179 337/500 [===================>..........] - ETA: 56s - loss: 0.3096 - regression_loss: 0.2917 - classification_loss: 0.0180 338/500 [===================>..........] - ETA: 55s - loss: 0.3092 - regression_loss: 0.2913 - classification_loss: 0.0179 339/500 [===================>..........] - ETA: 55s - loss: 0.3091 - regression_loss: 0.2912 - classification_loss: 0.0179 340/500 [===================>..........] - ETA: 55s - loss: 0.3086 - regression_loss: 0.2907 - classification_loss: 0.0179 341/500 [===================>..........] - ETA: 54s - loss: 0.3087 - regression_loss: 0.2908 - classification_loss: 0.0179 342/500 [===================>..........] - ETA: 54s - loss: 0.3085 - regression_loss: 0.2906 - classification_loss: 0.0179 343/500 [===================>..........] - ETA: 54s - loss: 0.3081 - regression_loss: 0.2902 - classification_loss: 0.0178 344/500 [===================>..........] - ETA: 53s - loss: 0.3084 - regression_loss: 0.2905 - classification_loss: 0.0179 345/500 [===================>..........] - ETA: 53s - loss: 0.3088 - regression_loss: 0.2909 - classification_loss: 0.0179 346/500 [===================>..........] - ETA: 52s - loss: 0.3086 - regression_loss: 0.2907 - classification_loss: 0.0179 347/500 [===================>..........] - ETA: 52s - loss: 0.3087 - regression_loss: 0.2908 - classification_loss: 0.0179 348/500 [===================>..........] - ETA: 52s - loss: 0.3089 - regression_loss: 0.2910 - classification_loss: 0.0179 349/500 [===================>..........] - ETA: 51s - loss: 0.3087 - regression_loss: 0.2908 - classification_loss: 0.0179 350/500 [====================>.........] 
- ETA: 51s - loss: 0.3086 - regression_loss: 0.2907 - classification_loss: 0.0179 351/500 [====================>.........] - ETA: 51s - loss: 0.3088 - regression_loss: 0.2909 - classification_loss: 0.0179 352/500 [====================>.........] - ETA: 50s - loss: 0.3083 - regression_loss: 0.2904 - classification_loss: 0.0178 353/500 [====================>.........] - ETA: 50s - loss: 0.3083 - regression_loss: 0.2905 - classification_loss: 0.0178 354/500 [====================>.........] - ETA: 50s - loss: 0.3082 - regression_loss: 0.2904 - classification_loss: 0.0178 355/500 [====================>.........] - ETA: 49s - loss: 0.3083 - regression_loss: 0.2905 - classification_loss: 0.0178 356/500 [====================>.........] - ETA: 49s - loss: 0.3085 - regression_loss: 0.2907 - classification_loss: 0.0178 357/500 [====================>.........] - ETA: 49s - loss: 0.3088 - regression_loss: 0.2909 - classification_loss: 0.0180 358/500 [====================>.........] - ETA: 48s - loss: 0.3097 - regression_loss: 0.2917 - classification_loss: 0.0180 359/500 [====================>.........] - ETA: 48s - loss: 0.3093 - regression_loss: 0.2913 - classification_loss: 0.0180 360/500 [====================>.........] - ETA: 48s - loss: 0.3091 - regression_loss: 0.2911 - classification_loss: 0.0180 361/500 [====================>.........] - ETA: 47s - loss: 0.3091 - regression_loss: 0.2911 - classification_loss: 0.0181 362/500 [====================>.........] - ETA: 47s - loss: 0.3087 - regression_loss: 0.2907 - classification_loss: 0.0180 363/500 [====================>.........] - ETA: 47s - loss: 0.3087 - regression_loss: 0.2907 - classification_loss: 0.0180 364/500 [====================>.........] - ETA: 46s - loss: 0.3087 - regression_loss: 0.2907 - classification_loss: 0.0180 365/500 [====================>.........] - ETA: 46s - loss: 0.3092 - regression_loss: 0.2911 - classification_loss: 0.0180 366/500 [====================>.........] 
- ETA: 46s - loss: 0.3092 - regression_loss: 0.2912 - classification_loss: 0.0180 367/500 [=====================>........] - ETA: 45s - loss: 0.3092 - regression_loss: 0.2911 - classification_loss: 0.0180 368/500 [=====================>........] - ETA: 45s - loss: 0.3087 - regression_loss: 0.2906 - classification_loss: 0.0180 369/500 [=====================>........] - ETA: 45s - loss: 0.3095 - regression_loss: 0.2913 - classification_loss: 0.0181 370/500 [=====================>........] - ETA: 44s - loss: 0.3092 - regression_loss: 0.2911 - classification_loss: 0.0181 371/500 [=====================>........] - ETA: 44s - loss: 0.3091 - regression_loss: 0.2910 - classification_loss: 0.0181 372/500 [=====================>........] - ETA: 44s - loss: 0.3091 - regression_loss: 0.2910 - classification_loss: 0.0181 373/500 [=====================>........] - ETA: 43s - loss: 0.3091 - regression_loss: 0.2910 - classification_loss: 0.0181 374/500 [=====================>........] - ETA: 43s - loss: 0.3091 - regression_loss: 0.2911 - classification_loss: 0.0181 375/500 [=====================>........] - ETA: 43s - loss: 0.3089 - regression_loss: 0.2908 - classification_loss: 0.0181 376/500 [=====================>........] - ETA: 42s - loss: 0.3086 - regression_loss: 0.2906 - classification_loss: 0.0180 377/500 [=====================>........] - ETA: 42s - loss: 0.3088 - regression_loss: 0.2908 - classification_loss: 0.0181 378/500 [=====================>........] - ETA: 41s - loss: 0.3084 - regression_loss: 0.2904 - classification_loss: 0.0180 379/500 [=====================>........] - ETA: 41s - loss: 0.3078 - regression_loss: 0.2898 - classification_loss: 0.0180 380/500 [=====================>........] - ETA: 41s - loss: 0.3075 - regression_loss: 0.2895 - classification_loss: 0.0180 381/500 [=====================>........] - ETA: 40s - loss: 0.3072 - regression_loss: 0.2892 - classification_loss: 0.0179 382/500 [=====================>........] 
[... per-batch training progress for epoch 37 (steps 383-499) omitted; loss held near 0.31 throughout ...]
500/500 [==============================] - 172s 344ms/step - loss: 0.3115 - regression_loss: 0.2938 - classification_loss: 0.0177
1172 instances of class plum with average precision: 0.7495
mAP: 0.7495
Epoch 00037: saving model to ./training/snapshots/resnet101_pascal_37.h5
Epoch 38/150
[... per-batch training progress for epoch 38 (steps 1-216) omitted; loss settled near 0.29 ...] 217/500 [============>.................]
- ETA: 1:37 - loss: 0.2976 - regression_loss: 0.2808 - classification_loss: 0.0168 218/500 [============>.................] - ETA: 1:36 - loss: 0.2970 - regression_loss: 0.2802 - classification_loss: 0.0167 219/500 [============>.................] - ETA: 1:36 - loss: 0.2970 - regression_loss: 0.2803 - classification_loss: 0.0167 220/500 [============>.................] - ETA: 1:35 - loss: 0.2971 - regression_loss: 0.2804 - classification_loss: 0.0167 221/500 [============>.................] - ETA: 1:35 - loss: 0.2965 - regression_loss: 0.2799 - classification_loss: 0.0167 222/500 [============>.................] - ETA: 1:35 - loss: 0.2961 - regression_loss: 0.2795 - classification_loss: 0.0166 223/500 [============>.................] - ETA: 1:34 - loss: 0.2966 - regression_loss: 0.2800 - classification_loss: 0.0166 224/500 [============>.................] - ETA: 1:34 - loss: 0.2962 - regression_loss: 0.2796 - classification_loss: 0.0166 225/500 [============>.................] - ETA: 1:34 - loss: 0.2959 - regression_loss: 0.2794 - classification_loss: 0.0165 226/500 [============>.................] - ETA: 1:33 - loss: 0.2962 - regression_loss: 0.2796 - classification_loss: 0.0165 227/500 [============>.................] - ETA: 1:33 - loss: 0.2967 - regression_loss: 0.2799 - classification_loss: 0.0168 228/500 [============>.................] - ETA: 1:33 - loss: 0.2959 - regression_loss: 0.2792 - classification_loss: 0.0167 229/500 [============>.................] - ETA: 1:32 - loss: 0.2951 - regression_loss: 0.2784 - classification_loss: 0.0167 230/500 [============>.................] - ETA: 1:32 - loss: 0.2952 - regression_loss: 0.2786 - classification_loss: 0.0167 231/500 [============>.................] - ETA: 1:32 - loss: 0.2950 - regression_loss: 0.2784 - classification_loss: 0.0166 232/500 [============>.................] - ETA: 1:31 - loss: 0.2952 - regression_loss: 0.2785 - classification_loss: 0.0166 233/500 [============>.................] 
- ETA: 1:31 - loss: 0.2957 - regression_loss: 0.2791 - classification_loss: 0.0166 234/500 [=============>................] - ETA: 1:31 - loss: 0.2964 - regression_loss: 0.2797 - classification_loss: 0.0167 235/500 [=============>................] - ETA: 1:30 - loss: 0.2962 - regression_loss: 0.2795 - classification_loss: 0.0167 236/500 [=============>................] - ETA: 1:30 - loss: 0.2962 - regression_loss: 0.2795 - classification_loss: 0.0167 237/500 [=============>................] - ETA: 1:30 - loss: 0.2965 - regression_loss: 0.2798 - classification_loss: 0.0168 238/500 [=============>................] - ETA: 1:29 - loss: 0.2962 - regression_loss: 0.2795 - classification_loss: 0.0168 239/500 [=============>................] - ETA: 1:29 - loss: 0.2960 - regression_loss: 0.2793 - classification_loss: 0.0167 240/500 [=============>................] - ETA: 1:29 - loss: 0.2952 - regression_loss: 0.2785 - classification_loss: 0.0166 241/500 [=============>................] - ETA: 1:28 - loss: 0.2947 - regression_loss: 0.2781 - classification_loss: 0.0166 242/500 [=============>................] - ETA: 1:28 - loss: 0.2945 - regression_loss: 0.2779 - classification_loss: 0.0166 243/500 [=============>................] - ETA: 1:28 - loss: 0.2949 - regression_loss: 0.2782 - classification_loss: 0.0166 244/500 [=============>................] - ETA: 1:27 - loss: 0.2943 - regression_loss: 0.2777 - classification_loss: 0.0166 245/500 [=============>................] - ETA: 1:27 - loss: 0.2937 - regression_loss: 0.2771 - classification_loss: 0.0166 246/500 [=============>................] - ETA: 1:27 - loss: 0.2933 - regression_loss: 0.2768 - classification_loss: 0.0165 247/500 [=============>................] - ETA: 1:26 - loss: 0.2934 - regression_loss: 0.2768 - classification_loss: 0.0166 248/500 [=============>................] - ETA: 1:26 - loss: 0.2934 - regression_loss: 0.2768 - classification_loss: 0.0166 249/500 [=============>................] 
- ETA: 1:25 - loss: 0.2929 - regression_loss: 0.2763 - classification_loss: 0.0166 250/500 [==============>...............] - ETA: 1:25 - loss: 0.2929 - regression_loss: 0.2763 - classification_loss: 0.0165 251/500 [==============>...............] - ETA: 1:25 - loss: 0.2925 - regression_loss: 0.2760 - classification_loss: 0.0165 252/500 [==============>...............] - ETA: 1:24 - loss: 0.2922 - regression_loss: 0.2757 - classification_loss: 0.0165 253/500 [==============>...............] - ETA: 1:24 - loss: 0.2916 - regression_loss: 0.2751 - classification_loss: 0.0165 254/500 [==============>...............] - ETA: 1:24 - loss: 0.2912 - regression_loss: 0.2748 - classification_loss: 0.0164 255/500 [==============>...............] - ETA: 1:23 - loss: 0.2922 - regression_loss: 0.2756 - classification_loss: 0.0166 256/500 [==============>...............] - ETA: 1:23 - loss: 0.2921 - regression_loss: 0.2755 - classification_loss: 0.0166 257/500 [==============>...............] - ETA: 1:23 - loss: 0.2921 - regression_loss: 0.2755 - classification_loss: 0.0166 258/500 [==============>...............] - ETA: 1:22 - loss: 0.2919 - regression_loss: 0.2753 - classification_loss: 0.0166 259/500 [==============>...............] - ETA: 1:22 - loss: 0.2911 - regression_loss: 0.2745 - classification_loss: 0.0165 260/500 [==============>...............] - ETA: 1:22 - loss: 0.2912 - regression_loss: 0.2746 - classification_loss: 0.0166 261/500 [==============>...............] - ETA: 1:21 - loss: 0.2917 - regression_loss: 0.2750 - classification_loss: 0.0167 262/500 [==============>...............] - ETA: 1:21 - loss: 0.2915 - regression_loss: 0.2748 - classification_loss: 0.0167 263/500 [==============>...............] - ETA: 1:21 - loss: 0.2915 - regression_loss: 0.2748 - classification_loss: 0.0167 264/500 [==============>...............] - ETA: 1:20 - loss: 0.2913 - regression_loss: 0.2746 - classification_loss: 0.0167 265/500 [==============>...............] 
- ETA: 1:20 - loss: 0.2925 - regression_loss: 0.2759 - classification_loss: 0.0166 266/500 [==============>...............] - ETA: 1:20 - loss: 0.2925 - regression_loss: 0.2758 - classification_loss: 0.0167 267/500 [===============>..............] - ETA: 1:19 - loss: 0.2932 - regression_loss: 0.2765 - classification_loss: 0.0167 268/500 [===============>..............] - ETA: 1:19 - loss: 0.2934 - regression_loss: 0.2766 - classification_loss: 0.0168 269/500 [===============>..............] - ETA: 1:19 - loss: 0.2932 - regression_loss: 0.2763 - classification_loss: 0.0168 270/500 [===============>..............] - ETA: 1:18 - loss: 0.2930 - regression_loss: 0.2761 - classification_loss: 0.0168 271/500 [===============>..............] - ETA: 1:18 - loss: 0.2934 - regression_loss: 0.2766 - classification_loss: 0.0168 272/500 [===============>..............] - ETA: 1:18 - loss: 0.2931 - regression_loss: 0.2762 - classification_loss: 0.0168 273/500 [===============>..............] - ETA: 1:17 - loss: 0.2929 - regression_loss: 0.2761 - classification_loss: 0.0168 274/500 [===============>..............] - ETA: 1:17 - loss: 0.2927 - regression_loss: 0.2759 - classification_loss: 0.0168 275/500 [===============>..............] - ETA: 1:17 - loss: 0.2926 - regression_loss: 0.2758 - classification_loss: 0.0168 276/500 [===============>..............] - ETA: 1:16 - loss: 0.2924 - regression_loss: 0.2757 - classification_loss: 0.0168 277/500 [===============>..............] - ETA: 1:16 - loss: 0.2924 - regression_loss: 0.2756 - classification_loss: 0.0168 278/500 [===============>..............] - ETA: 1:16 - loss: 0.2922 - regression_loss: 0.2754 - classification_loss: 0.0167 279/500 [===============>..............] - ETA: 1:15 - loss: 0.2918 - regression_loss: 0.2750 - classification_loss: 0.0167 280/500 [===============>..............] - ETA: 1:15 - loss: 0.2920 - regression_loss: 0.2753 - classification_loss: 0.0167 281/500 [===============>..............] 
- ETA: 1:15 - loss: 0.2921 - regression_loss: 0.2754 - classification_loss: 0.0167 282/500 [===============>..............] - ETA: 1:14 - loss: 0.2921 - regression_loss: 0.2754 - classification_loss: 0.0167 283/500 [===============>..............] - ETA: 1:14 - loss: 0.2917 - regression_loss: 0.2751 - classification_loss: 0.0167 284/500 [================>.............] - ETA: 1:14 - loss: 0.2914 - regression_loss: 0.2747 - classification_loss: 0.0167 285/500 [================>.............] - ETA: 1:13 - loss: 0.2909 - regression_loss: 0.2742 - classification_loss: 0.0166 286/500 [================>.............] - ETA: 1:13 - loss: 0.2915 - regression_loss: 0.2749 - classification_loss: 0.0166 287/500 [================>.............] - ETA: 1:13 - loss: 0.2915 - regression_loss: 0.2749 - classification_loss: 0.0166 288/500 [================>.............] - ETA: 1:12 - loss: 0.2912 - regression_loss: 0.2747 - classification_loss: 0.0166 289/500 [================>.............] - ETA: 1:12 - loss: 0.2921 - regression_loss: 0.2754 - classification_loss: 0.0167 290/500 [================>.............] - ETA: 1:12 - loss: 0.2924 - regression_loss: 0.2757 - classification_loss: 0.0167 291/500 [================>.............] - ETA: 1:11 - loss: 0.2920 - regression_loss: 0.2753 - classification_loss: 0.0166 292/500 [================>.............] - ETA: 1:11 - loss: 0.2914 - regression_loss: 0.2748 - classification_loss: 0.0166 293/500 [================>.............] - ETA: 1:11 - loss: 0.2913 - regression_loss: 0.2747 - classification_loss: 0.0166 294/500 [================>.............] - ETA: 1:10 - loss: 0.2915 - regression_loss: 0.2749 - classification_loss: 0.0166 295/500 [================>.............] - ETA: 1:10 - loss: 0.2919 - regression_loss: 0.2753 - classification_loss: 0.0166 296/500 [================>.............] - ETA: 1:10 - loss: 0.2915 - regression_loss: 0.2750 - classification_loss: 0.0165 297/500 [================>.............] 
- ETA: 1:09 - loss: 0.2917 - regression_loss: 0.2751 - classification_loss: 0.0165 298/500 [================>.............] - ETA: 1:09 - loss: 0.2917 - regression_loss: 0.2751 - classification_loss: 0.0165 299/500 [================>.............] - ETA: 1:08 - loss: 0.2913 - regression_loss: 0.2748 - classification_loss: 0.0165 300/500 [=================>............] - ETA: 1:08 - loss: 0.2910 - regression_loss: 0.2746 - classification_loss: 0.0165 301/500 [=================>............] - ETA: 1:08 - loss: 0.2908 - regression_loss: 0.2743 - classification_loss: 0.0164 302/500 [=================>............] - ETA: 1:07 - loss: 0.2903 - regression_loss: 0.2739 - classification_loss: 0.0164 303/500 [=================>............] - ETA: 1:07 - loss: 0.2908 - regression_loss: 0.2744 - classification_loss: 0.0164 304/500 [=================>............] - ETA: 1:07 - loss: 0.2906 - regression_loss: 0.2743 - classification_loss: 0.0164 305/500 [=================>............] - ETA: 1:06 - loss: 0.2906 - regression_loss: 0.2742 - classification_loss: 0.0164 306/500 [=================>............] - ETA: 1:06 - loss: 0.2903 - regression_loss: 0.2740 - classification_loss: 0.0163 307/500 [=================>............] - ETA: 1:06 - loss: 0.2907 - regression_loss: 0.2743 - classification_loss: 0.0164 308/500 [=================>............] - ETA: 1:05 - loss: 0.2902 - regression_loss: 0.2738 - classification_loss: 0.0164 309/500 [=================>............] - ETA: 1:05 - loss: 0.2900 - regression_loss: 0.2736 - classification_loss: 0.0163 310/500 [=================>............] - ETA: 1:05 - loss: 0.2904 - regression_loss: 0.2741 - classification_loss: 0.0164 311/500 [=================>............] - ETA: 1:04 - loss: 0.2904 - regression_loss: 0.2741 - classification_loss: 0.0164 312/500 [=================>............] - ETA: 1:04 - loss: 0.2904 - regression_loss: 0.2741 - classification_loss: 0.0164 313/500 [=================>............] 
- ETA: 1:04 - loss: 0.2901 - regression_loss: 0.2738 - classification_loss: 0.0163 314/500 [=================>............] - ETA: 1:03 - loss: 0.2900 - regression_loss: 0.2737 - classification_loss: 0.0163 315/500 [=================>............] - ETA: 1:03 - loss: 0.2898 - regression_loss: 0.2735 - classification_loss: 0.0163 316/500 [=================>............] - ETA: 1:03 - loss: 0.2902 - regression_loss: 0.2738 - classification_loss: 0.0164 317/500 [==================>...........] - ETA: 1:02 - loss: 0.2897 - regression_loss: 0.2733 - classification_loss: 0.0164 318/500 [==================>...........] - ETA: 1:02 - loss: 0.2897 - regression_loss: 0.2733 - classification_loss: 0.0164 319/500 [==================>...........] - ETA: 1:02 - loss: 0.2890 - regression_loss: 0.2727 - classification_loss: 0.0164 320/500 [==================>...........] - ETA: 1:01 - loss: 0.2884 - regression_loss: 0.2721 - classification_loss: 0.0163 321/500 [==================>...........] - ETA: 1:01 - loss: 0.2888 - regression_loss: 0.2725 - classification_loss: 0.0163 322/500 [==================>...........] - ETA: 1:01 - loss: 0.2891 - regression_loss: 0.2727 - classification_loss: 0.0163 323/500 [==================>...........] - ETA: 1:00 - loss: 0.2890 - regression_loss: 0.2727 - classification_loss: 0.0164 324/500 [==================>...........] - ETA: 1:00 - loss: 0.2889 - regression_loss: 0.2725 - classification_loss: 0.0163 325/500 [==================>...........] - ETA: 1:00 - loss: 0.2894 - regression_loss: 0.2730 - classification_loss: 0.0164 326/500 [==================>...........] - ETA: 59s - loss: 0.2894 - regression_loss: 0.2730 - classification_loss: 0.0164  327/500 [==================>...........] - ETA: 59s - loss: 0.2894 - regression_loss: 0.2730 - classification_loss: 0.0164 328/500 [==================>...........] - ETA: 59s - loss: 0.2896 - regression_loss: 0.2732 - classification_loss: 0.0164 329/500 [==================>...........] 
- ETA: 58s - loss: 0.2892 - regression_loss: 0.2728 - classification_loss: 0.0164 330/500 [==================>...........] - ETA: 58s - loss: 0.2892 - regression_loss: 0.2729 - classification_loss: 0.0164 331/500 [==================>...........] - ETA: 58s - loss: 0.2895 - regression_loss: 0.2731 - classification_loss: 0.0164 332/500 [==================>...........] - ETA: 57s - loss: 0.2895 - regression_loss: 0.2732 - classification_loss: 0.0164 333/500 [==================>...........] - ETA: 57s - loss: 0.2895 - regression_loss: 0.2731 - classification_loss: 0.0164 334/500 [===================>..........] - ETA: 56s - loss: 0.2893 - regression_loss: 0.2729 - classification_loss: 0.0164 335/500 [===================>..........] - ETA: 56s - loss: 0.2898 - regression_loss: 0.2733 - classification_loss: 0.0164 336/500 [===================>..........] - ETA: 56s - loss: 0.2905 - regression_loss: 0.2741 - classification_loss: 0.0165 337/500 [===================>..........] - ETA: 55s - loss: 0.2906 - regression_loss: 0.2742 - classification_loss: 0.0164 338/500 [===================>..........] - ETA: 55s - loss: 0.2907 - regression_loss: 0.2743 - classification_loss: 0.0164 339/500 [===================>..........] - ETA: 55s - loss: 0.2909 - regression_loss: 0.2745 - classification_loss: 0.0164 340/500 [===================>..........] - ETA: 54s - loss: 0.2913 - regression_loss: 0.2749 - classification_loss: 0.0164 341/500 [===================>..........] - ETA: 54s - loss: 0.2916 - regression_loss: 0.2752 - classification_loss: 0.0164 342/500 [===================>..........] - ETA: 54s - loss: 0.2917 - regression_loss: 0.2753 - classification_loss: 0.0164 343/500 [===================>..........] - ETA: 53s - loss: 0.2914 - regression_loss: 0.2751 - classification_loss: 0.0164 344/500 [===================>..........] - ETA: 53s - loss: 0.2911 - regression_loss: 0.2748 - classification_loss: 0.0163 345/500 [===================>..........] 
- ETA: 53s - loss: 0.2911 - regression_loss: 0.2747 - classification_loss: 0.0163 346/500 [===================>..........] - ETA: 52s - loss: 0.2908 - regression_loss: 0.2745 - classification_loss: 0.0163 347/500 [===================>..........] - ETA: 52s - loss: 0.2909 - regression_loss: 0.2746 - classification_loss: 0.0163 348/500 [===================>..........] - ETA: 52s - loss: 0.2903 - regression_loss: 0.2740 - classification_loss: 0.0163 349/500 [===================>..........] - ETA: 51s - loss: 0.2908 - regression_loss: 0.2745 - classification_loss: 0.0164 350/500 [====================>.........] - ETA: 51s - loss: 0.2910 - regression_loss: 0.2747 - classification_loss: 0.0164 351/500 [====================>.........] - ETA: 51s - loss: 0.2910 - regression_loss: 0.2746 - classification_loss: 0.0164 352/500 [====================>.........] - ETA: 50s - loss: 0.2909 - regression_loss: 0.2746 - classification_loss: 0.0163 353/500 [====================>.........] - ETA: 50s - loss: 0.2908 - regression_loss: 0.2745 - classification_loss: 0.0163 354/500 [====================>.........] - ETA: 50s - loss: 0.2912 - regression_loss: 0.2749 - classification_loss: 0.0164 355/500 [====================>.........] - ETA: 49s - loss: 0.2913 - regression_loss: 0.2750 - classification_loss: 0.0164 356/500 [====================>.........] - ETA: 49s - loss: 0.2912 - regression_loss: 0.2748 - classification_loss: 0.0164 357/500 [====================>.........] - ETA: 49s - loss: 0.2910 - regression_loss: 0.2747 - classification_loss: 0.0163 358/500 [====================>.........] - ETA: 48s - loss: 0.2910 - regression_loss: 0.2746 - classification_loss: 0.0164 359/500 [====================>.........] - ETA: 48s - loss: 0.2905 - regression_loss: 0.2741 - classification_loss: 0.0163 360/500 [====================>.........] - ETA: 48s - loss: 0.2902 - regression_loss: 0.2739 - classification_loss: 0.0163 361/500 [====================>.........] 
- ETA: 47s - loss: 0.2898 - regression_loss: 0.2735 - classification_loss: 0.0163 362/500 [====================>.........] - ETA: 47s - loss: 0.2895 - regression_loss: 0.2732 - classification_loss: 0.0163 363/500 [====================>.........] - ETA: 47s - loss: 0.2902 - regression_loss: 0.2739 - classification_loss: 0.0163 364/500 [====================>.........] - ETA: 46s - loss: 0.2909 - regression_loss: 0.2745 - classification_loss: 0.0163 365/500 [====================>.........] - ETA: 46s - loss: 0.2910 - regression_loss: 0.2746 - classification_loss: 0.0164 366/500 [====================>.........] - ETA: 45s - loss: 0.2910 - regression_loss: 0.2746 - classification_loss: 0.0164 367/500 [=====================>........] - ETA: 45s - loss: 0.2910 - regression_loss: 0.2745 - classification_loss: 0.0164 368/500 [=====================>........] - ETA: 45s - loss: 0.2918 - regression_loss: 0.2753 - classification_loss: 0.0165 369/500 [=====================>........] - ETA: 44s - loss: 0.2919 - regression_loss: 0.2754 - classification_loss: 0.0165 370/500 [=====================>........] - ETA: 44s - loss: 0.2914 - regression_loss: 0.2749 - classification_loss: 0.0165 371/500 [=====================>........] - ETA: 44s - loss: 0.2912 - regression_loss: 0.2747 - classification_loss: 0.0165 372/500 [=====================>........] - ETA: 43s - loss: 0.2908 - regression_loss: 0.2743 - classification_loss: 0.0165 373/500 [=====================>........] - ETA: 43s - loss: 0.2911 - regression_loss: 0.2746 - classification_loss: 0.0165 374/500 [=====================>........] - ETA: 43s - loss: 0.2907 - regression_loss: 0.2742 - classification_loss: 0.0165 375/500 [=====================>........] - ETA: 42s - loss: 0.2912 - regression_loss: 0.2747 - classification_loss: 0.0165 376/500 [=====================>........] - ETA: 42s - loss: 0.2909 - regression_loss: 0.2744 - classification_loss: 0.0165 377/500 [=====================>........] 
- ETA: 42s - loss: 0.2909 - regression_loss: 0.2744 - classification_loss: 0.0165 378/500 [=====================>........] - ETA: 41s - loss: 0.2909 - regression_loss: 0.2744 - classification_loss: 0.0165 379/500 [=====================>........] - ETA: 41s - loss: 0.2912 - regression_loss: 0.2747 - classification_loss: 0.0165 380/500 [=====================>........] - ETA: 41s - loss: 0.2913 - regression_loss: 0.2748 - classification_loss: 0.0165 381/500 [=====================>........] - ETA: 40s - loss: 0.2912 - regression_loss: 0.2747 - classification_loss: 0.0165 382/500 [=====================>........] - ETA: 40s - loss: 0.2914 - regression_loss: 0.2748 - classification_loss: 0.0165 383/500 [=====================>........] - ETA: 40s - loss: 0.2914 - regression_loss: 0.2748 - classification_loss: 0.0166 384/500 [======================>.......] - ETA: 39s - loss: 0.2913 - regression_loss: 0.2748 - classification_loss: 0.0166 385/500 [======================>.......] - ETA: 39s - loss: 0.2911 - regression_loss: 0.2746 - classification_loss: 0.0166 386/500 [======================>.......] - ETA: 39s - loss: 0.2910 - regression_loss: 0.2744 - classification_loss: 0.0166 387/500 [======================>.......] - ETA: 38s - loss: 0.2907 - regression_loss: 0.2741 - classification_loss: 0.0165 388/500 [======================>.......] - ETA: 38s - loss: 0.2904 - regression_loss: 0.2739 - classification_loss: 0.0165 389/500 [======================>.......] - ETA: 38s - loss: 0.2905 - regression_loss: 0.2740 - classification_loss: 0.0165 390/500 [======================>.......] - ETA: 37s - loss: 0.2901 - regression_loss: 0.2736 - classification_loss: 0.0165 391/500 [======================>.......] - ETA: 37s - loss: 0.2898 - regression_loss: 0.2734 - classification_loss: 0.0164 392/500 [======================>.......] - ETA: 37s - loss: 0.2897 - regression_loss: 0.2732 - classification_loss: 0.0164 393/500 [======================>.......] 
- ETA: 36s - loss: 0.2897 - regression_loss: 0.2732 - classification_loss: 0.0165 394/500 [======================>.......] - ETA: 36s - loss: 0.2907 - regression_loss: 0.2742 - classification_loss: 0.0165 395/500 [======================>.......] - ETA: 36s - loss: 0.2908 - regression_loss: 0.2742 - classification_loss: 0.0166 396/500 [======================>.......] - ETA: 35s - loss: 0.2905 - regression_loss: 0.2739 - classification_loss: 0.0165 397/500 [======================>.......] - ETA: 35s - loss: 0.2908 - regression_loss: 0.2741 - classification_loss: 0.0166 398/500 [======================>.......] - ETA: 34s - loss: 0.2906 - regression_loss: 0.2740 - classification_loss: 0.0166 399/500 [======================>.......] - ETA: 34s - loss: 0.2903 - regression_loss: 0.2737 - classification_loss: 0.0166 400/500 [=======================>......] - ETA: 34s - loss: 0.2904 - regression_loss: 0.2738 - classification_loss: 0.0165 401/500 [=======================>......] - ETA: 33s - loss: 0.2909 - regression_loss: 0.2743 - classification_loss: 0.0166 402/500 [=======================>......] - ETA: 33s - loss: 0.2911 - regression_loss: 0.2745 - classification_loss: 0.0166 403/500 [=======================>......] - ETA: 33s - loss: 0.2909 - regression_loss: 0.2743 - classification_loss: 0.0166 404/500 [=======================>......] - ETA: 32s - loss: 0.2909 - regression_loss: 0.2744 - classification_loss: 0.0166 405/500 [=======================>......] - ETA: 32s - loss: 0.2914 - regression_loss: 0.2747 - classification_loss: 0.0167 406/500 [=======================>......] - ETA: 32s - loss: 0.2912 - regression_loss: 0.2746 - classification_loss: 0.0167 407/500 [=======================>......] - ETA: 31s - loss: 0.2911 - regression_loss: 0.2744 - classification_loss: 0.0166 408/500 [=======================>......] - ETA: 31s - loss: 0.2914 - regression_loss: 0.2748 - classification_loss: 0.0166 409/500 [=======================>......] 
- ETA: 31s - loss: 0.2913 - regression_loss: 0.2747 - classification_loss: 0.0166 410/500 [=======================>......] - ETA: 30s - loss: 0.2911 - regression_loss: 0.2745 - classification_loss: 0.0166 411/500 [=======================>......] - ETA: 30s - loss: 0.2911 - regression_loss: 0.2745 - classification_loss: 0.0166 412/500 [=======================>......] - ETA: 30s - loss: 0.2915 - regression_loss: 0.2749 - classification_loss: 0.0167 413/500 [=======================>......] - ETA: 29s - loss: 0.2919 - regression_loss: 0.2752 - classification_loss: 0.0167 414/500 [=======================>......] - ETA: 29s - loss: 0.2919 - regression_loss: 0.2752 - classification_loss: 0.0167 415/500 [=======================>......] - ETA: 29s - loss: 0.2919 - regression_loss: 0.2752 - classification_loss: 0.0167 416/500 [=======================>......] - ETA: 28s - loss: 0.2918 - regression_loss: 0.2751 - classification_loss: 0.0167 417/500 [========================>.....] - ETA: 28s - loss: 0.2918 - regression_loss: 0.2751 - classification_loss: 0.0167 418/500 [========================>.....] - ETA: 28s - loss: 0.2918 - regression_loss: 0.2751 - classification_loss: 0.0167 419/500 [========================>.....] - ETA: 27s - loss: 0.2917 - regression_loss: 0.2750 - classification_loss: 0.0167 420/500 [========================>.....] - ETA: 27s - loss: 0.2915 - regression_loss: 0.2748 - classification_loss: 0.0167 421/500 [========================>.....] - ETA: 27s - loss: 0.2911 - regression_loss: 0.2745 - classification_loss: 0.0166 422/500 [========================>.....] - ETA: 26s - loss: 0.2906 - regression_loss: 0.2740 - classification_loss: 0.0166 423/500 [========================>.....] - ETA: 26s - loss: 0.2907 - regression_loss: 0.2741 - classification_loss: 0.0166 424/500 [========================>.....] - ETA: 26s - loss: 0.2903 - regression_loss: 0.2737 - classification_loss: 0.0166 425/500 [========================>.....] 
[... Epoch 38 per-batch progress updates elided (steps 426–500, running loss ~0.29) ...]
500/500 [==============================] - 171s 343ms/step - loss: 0.2909 - regression_loss: 0.2744 - classification_loss: 0.0165
1172 instances of class plum with average precision: 0.7393
mAP: 0.7393
Epoch 00038: saving model to ./training/snapshots/resnet101_pascal_38.h5
Epoch 39/150
[... Epoch 39 per-batch progress updates elided (through step 260/500, running loss ~0.296 - regression_loss ~0.280 - classification_loss ~0.016) ...]
- ETA: 1:22 - loss: 0.2963 - regression_loss: 0.2803 - classification_loss: 0.0160 261/500 [==============>...............] - ETA: 1:22 - loss: 0.2963 - regression_loss: 0.2803 - classification_loss: 0.0160 262/500 [==============>...............] - ETA: 1:21 - loss: 0.2958 - regression_loss: 0.2799 - classification_loss: 0.0160 263/500 [==============>...............] - ETA: 1:21 - loss: 0.2956 - regression_loss: 0.2796 - classification_loss: 0.0159 264/500 [==============>...............] - ETA: 1:21 - loss: 0.2950 - regression_loss: 0.2791 - classification_loss: 0.0159 265/500 [==============>...............] - ETA: 1:20 - loss: 0.2946 - regression_loss: 0.2788 - classification_loss: 0.0158 266/500 [==============>...............] - ETA: 1:20 - loss: 0.2945 - regression_loss: 0.2787 - classification_loss: 0.0158 267/500 [===============>..............] - ETA: 1:20 - loss: 0.2942 - regression_loss: 0.2784 - classification_loss: 0.0158 268/500 [===============>..............] - ETA: 1:19 - loss: 0.2939 - regression_loss: 0.2781 - classification_loss: 0.0158 269/500 [===============>..............] - ETA: 1:19 - loss: 0.2936 - regression_loss: 0.2779 - classification_loss: 0.0157 270/500 [===============>..............] - ETA: 1:19 - loss: 0.2935 - regression_loss: 0.2778 - classification_loss: 0.0157 271/500 [===============>..............] - ETA: 1:18 - loss: 0.2936 - regression_loss: 0.2778 - classification_loss: 0.0157 272/500 [===============>..............] - ETA: 1:18 - loss: 0.2936 - regression_loss: 0.2778 - classification_loss: 0.0158 273/500 [===============>..............] - ETA: 1:17 - loss: 0.2934 - regression_loss: 0.2776 - classification_loss: 0.0158 274/500 [===============>..............] - ETA: 1:17 - loss: 0.2932 - regression_loss: 0.2775 - classification_loss: 0.0158 275/500 [===============>..............] - ETA: 1:17 - loss: 0.2932 - regression_loss: 0.2775 - classification_loss: 0.0157 276/500 [===============>..............] 
- ETA: 1:16 - loss: 0.2930 - regression_loss: 0.2773 - classification_loss: 0.0157 277/500 [===============>..............] - ETA: 1:16 - loss: 0.2925 - regression_loss: 0.2768 - classification_loss: 0.0157 278/500 [===============>..............] - ETA: 1:16 - loss: 0.2925 - regression_loss: 0.2768 - classification_loss: 0.0157 279/500 [===============>..............] - ETA: 1:15 - loss: 0.2918 - regression_loss: 0.2762 - classification_loss: 0.0156 280/500 [===============>..............] - ETA: 1:15 - loss: 0.2921 - regression_loss: 0.2763 - classification_loss: 0.0158 281/500 [===============>..............] - ETA: 1:15 - loss: 0.2920 - regression_loss: 0.2762 - classification_loss: 0.0158 282/500 [===============>..............] - ETA: 1:14 - loss: 0.2912 - regression_loss: 0.2755 - classification_loss: 0.0157 283/500 [===============>..............] - ETA: 1:14 - loss: 0.2917 - regression_loss: 0.2759 - classification_loss: 0.0158 284/500 [================>.............] - ETA: 1:14 - loss: 0.2912 - regression_loss: 0.2754 - classification_loss: 0.0158 285/500 [================>.............] - ETA: 1:13 - loss: 0.2914 - regression_loss: 0.2755 - classification_loss: 0.0159 286/500 [================>.............] - ETA: 1:13 - loss: 0.2908 - regression_loss: 0.2750 - classification_loss: 0.0158 287/500 [================>.............] - ETA: 1:13 - loss: 0.2905 - regression_loss: 0.2747 - classification_loss: 0.0158 288/500 [================>.............] - ETA: 1:12 - loss: 0.2901 - regression_loss: 0.2743 - classification_loss: 0.0158 289/500 [================>.............] - ETA: 1:12 - loss: 0.2899 - regression_loss: 0.2742 - classification_loss: 0.0158 290/500 [================>.............] - ETA: 1:12 - loss: 0.2894 - regression_loss: 0.2737 - classification_loss: 0.0157 291/500 [================>.............] - ETA: 1:11 - loss: 0.2898 - regression_loss: 0.2740 - classification_loss: 0.0158 292/500 [================>.............] 
- ETA: 1:11 - loss: 0.2897 - regression_loss: 0.2739 - classification_loss: 0.0158 293/500 [================>.............] - ETA: 1:11 - loss: 0.2913 - regression_loss: 0.2754 - classification_loss: 0.0159 294/500 [================>.............] - ETA: 1:10 - loss: 0.2908 - regression_loss: 0.2749 - classification_loss: 0.0158 295/500 [================>.............] - ETA: 1:10 - loss: 0.2904 - regression_loss: 0.2746 - classification_loss: 0.0158 296/500 [================>.............] - ETA: 1:10 - loss: 0.2904 - regression_loss: 0.2746 - classification_loss: 0.0158 297/500 [================>.............] - ETA: 1:09 - loss: 0.2899 - regression_loss: 0.2741 - classification_loss: 0.0158 298/500 [================>.............] - ETA: 1:09 - loss: 0.2899 - regression_loss: 0.2742 - classification_loss: 0.0158 299/500 [================>.............] - ETA: 1:09 - loss: 0.2900 - regression_loss: 0.2742 - classification_loss: 0.0158 300/500 [=================>............] - ETA: 1:08 - loss: 0.2902 - regression_loss: 0.2744 - classification_loss: 0.0158 301/500 [=================>............] - ETA: 1:08 - loss: 0.2905 - regression_loss: 0.2747 - classification_loss: 0.0158 302/500 [=================>............] - ETA: 1:08 - loss: 0.2901 - regression_loss: 0.2743 - classification_loss: 0.0158 303/500 [=================>............] - ETA: 1:07 - loss: 0.2898 - regression_loss: 0.2740 - classification_loss: 0.0158 304/500 [=================>............] - ETA: 1:07 - loss: 0.2897 - regression_loss: 0.2740 - classification_loss: 0.0157 305/500 [=================>............] - ETA: 1:07 - loss: 0.2899 - regression_loss: 0.2741 - classification_loss: 0.0158 306/500 [=================>............] - ETA: 1:06 - loss: 0.2897 - regression_loss: 0.2740 - classification_loss: 0.0158 307/500 [=================>............] - ETA: 1:06 - loss: 0.2894 - regression_loss: 0.2736 - classification_loss: 0.0157 308/500 [=================>............] 
- ETA: 1:05 - loss: 0.2893 - regression_loss: 0.2736 - classification_loss: 0.0157 309/500 [=================>............] - ETA: 1:05 - loss: 0.2902 - regression_loss: 0.2744 - classification_loss: 0.0157 310/500 [=================>............] - ETA: 1:05 - loss: 0.2907 - regression_loss: 0.2749 - classification_loss: 0.0158 311/500 [=================>............] - ETA: 1:04 - loss: 0.2906 - regression_loss: 0.2748 - classification_loss: 0.0158 312/500 [=================>............] - ETA: 1:04 - loss: 0.2907 - regression_loss: 0.2749 - classification_loss: 0.0158 313/500 [=================>............] - ETA: 1:04 - loss: 0.2911 - regression_loss: 0.2752 - classification_loss: 0.0159 314/500 [=================>............] - ETA: 1:03 - loss: 0.2909 - regression_loss: 0.2750 - classification_loss: 0.0159 315/500 [=================>............] - ETA: 1:03 - loss: 0.2909 - regression_loss: 0.2750 - classification_loss: 0.0159 316/500 [=================>............] - ETA: 1:03 - loss: 0.2903 - regression_loss: 0.2745 - classification_loss: 0.0158 317/500 [==================>...........] - ETA: 1:02 - loss: 0.2897 - regression_loss: 0.2739 - classification_loss: 0.0158 318/500 [==================>...........] - ETA: 1:02 - loss: 0.2894 - regression_loss: 0.2737 - classification_loss: 0.0158 319/500 [==================>...........] - ETA: 1:02 - loss: 0.2900 - regression_loss: 0.2742 - classification_loss: 0.0158 320/500 [==================>...........] - ETA: 1:01 - loss: 0.2907 - regression_loss: 0.2747 - classification_loss: 0.0160 321/500 [==================>...........] - ETA: 1:01 - loss: 0.2909 - regression_loss: 0.2749 - classification_loss: 0.0160 322/500 [==================>...........] - ETA: 1:01 - loss: 0.2905 - regression_loss: 0.2746 - classification_loss: 0.0160 323/500 [==================>...........] - ETA: 1:00 - loss: 0.2902 - regression_loss: 0.2743 - classification_loss: 0.0160 324/500 [==================>...........] 
- ETA: 1:00 - loss: 0.2901 - regression_loss: 0.2741 - classification_loss: 0.0159 325/500 [==================>...........] - ETA: 1:00 - loss: 0.2898 - regression_loss: 0.2738 - classification_loss: 0.0159 326/500 [==================>...........] - ETA: 59s - loss: 0.2894 - regression_loss: 0.2735 - classification_loss: 0.0159  327/500 [==================>...........] - ETA: 59s - loss: 0.2895 - regression_loss: 0.2736 - classification_loss: 0.0159 328/500 [==================>...........] - ETA: 59s - loss: 0.2895 - regression_loss: 0.2735 - classification_loss: 0.0160 329/500 [==================>...........] - ETA: 58s - loss: 0.2894 - regression_loss: 0.2734 - classification_loss: 0.0160 330/500 [==================>...........] - ETA: 58s - loss: 0.2897 - regression_loss: 0.2736 - classification_loss: 0.0160 331/500 [==================>...........] - ETA: 58s - loss: 0.2898 - regression_loss: 0.2737 - classification_loss: 0.0160 332/500 [==================>...........] - ETA: 57s - loss: 0.2899 - regression_loss: 0.2739 - classification_loss: 0.0160 333/500 [==================>...........] - ETA: 57s - loss: 0.2901 - regression_loss: 0.2741 - classification_loss: 0.0161 334/500 [===================>..........] - ETA: 57s - loss: 0.2898 - regression_loss: 0.2738 - classification_loss: 0.0160 335/500 [===================>..........] - ETA: 56s - loss: 0.2901 - regression_loss: 0.2741 - classification_loss: 0.0160 336/500 [===================>..........] - ETA: 56s - loss: 0.2904 - regression_loss: 0.2743 - classification_loss: 0.0160 337/500 [===================>..........] - ETA: 55s - loss: 0.2906 - regression_loss: 0.2746 - classification_loss: 0.0160 338/500 [===================>..........] - ETA: 55s - loss: 0.2907 - regression_loss: 0.2747 - classification_loss: 0.0161 339/500 [===================>..........] - ETA: 55s - loss: 0.2905 - regression_loss: 0.2744 - classification_loss: 0.0161 340/500 [===================>..........] 
- ETA: 54s - loss: 0.2904 - regression_loss: 0.2743 - classification_loss: 0.0160 341/500 [===================>..........] - ETA: 54s - loss: 0.2902 - regression_loss: 0.2742 - classification_loss: 0.0160 342/500 [===================>..........] - ETA: 54s - loss: 0.2904 - regression_loss: 0.2745 - classification_loss: 0.0160 343/500 [===================>..........] - ETA: 53s - loss: 0.2901 - regression_loss: 0.2741 - classification_loss: 0.0160 344/500 [===================>..........] - ETA: 53s - loss: 0.2901 - regression_loss: 0.2741 - classification_loss: 0.0160 345/500 [===================>..........] - ETA: 53s - loss: 0.2899 - regression_loss: 0.2738 - classification_loss: 0.0160 346/500 [===================>..........] - ETA: 52s - loss: 0.2895 - regression_loss: 0.2735 - classification_loss: 0.0160 347/500 [===================>..........] - ETA: 52s - loss: 0.2894 - regression_loss: 0.2734 - classification_loss: 0.0160 348/500 [===================>..........] - ETA: 52s - loss: 0.2896 - regression_loss: 0.2736 - classification_loss: 0.0160 349/500 [===================>..........] - ETA: 51s - loss: 0.2893 - regression_loss: 0.2733 - classification_loss: 0.0160 350/500 [====================>.........] - ETA: 51s - loss: 0.2896 - regression_loss: 0.2737 - classification_loss: 0.0160 351/500 [====================>.........] - ETA: 51s - loss: 0.2897 - regression_loss: 0.2738 - classification_loss: 0.0160 352/500 [====================>.........] - ETA: 50s - loss: 0.2901 - regression_loss: 0.2742 - classification_loss: 0.0159 353/500 [====================>.........] - ETA: 50s - loss: 0.2898 - regression_loss: 0.2738 - classification_loss: 0.0159 354/500 [====================>.........] - ETA: 50s - loss: 0.2896 - regression_loss: 0.2737 - classification_loss: 0.0159 355/500 [====================>.........] - ETA: 49s - loss: 0.2896 - regression_loss: 0.2737 - classification_loss: 0.0159 356/500 [====================>.........] 
- ETA: 49s - loss: 0.2897 - regression_loss: 0.2738 - classification_loss: 0.0159 357/500 [====================>.........] - ETA: 49s - loss: 0.2901 - regression_loss: 0.2742 - classification_loss: 0.0159 358/500 [====================>.........] - ETA: 48s - loss: 0.2908 - regression_loss: 0.2749 - classification_loss: 0.0159 359/500 [====================>.........] - ETA: 48s - loss: 0.2908 - regression_loss: 0.2750 - classification_loss: 0.0158 360/500 [====================>.........] - ETA: 48s - loss: 0.2907 - regression_loss: 0.2748 - classification_loss: 0.0159 361/500 [====================>.........] - ETA: 47s - loss: 0.2909 - regression_loss: 0.2750 - classification_loss: 0.0159 362/500 [====================>.........] - ETA: 47s - loss: 0.2912 - regression_loss: 0.2752 - classification_loss: 0.0160 363/500 [====================>.........] - ETA: 47s - loss: 0.2908 - regression_loss: 0.2748 - classification_loss: 0.0159 364/500 [====================>.........] - ETA: 46s - loss: 0.2909 - regression_loss: 0.2749 - classification_loss: 0.0159 365/500 [====================>.........] - ETA: 46s - loss: 0.2906 - regression_loss: 0.2747 - classification_loss: 0.0159 366/500 [====================>.........] - ETA: 46s - loss: 0.2903 - regression_loss: 0.2744 - classification_loss: 0.0159 367/500 [=====================>........] - ETA: 45s - loss: 0.2902 - regression_loss: 0.2743 - classification_loss: 0.0159 368/500 [=====================>........] - ETA: 45s - loss: 0.2902 - regression_loss: 0.2742 - classification_loss: 0.0159 369/500 [=====================>........] - ETA: 45s - loss: 0.2904 - regression_loss: 0.2744 - classification_loss: 0.0159 370/500 [=====================>........] - ETA: 44s - loss: 0.2903 - regression_loss: 0.2744 - classification_loss: 0.0159 371/500 [=====================>........] - ETA: 44s - loss: 0.2901 - regression_loss: 0.2742 - classification_loss: 0.0159 372/500 [=====================>........] 
- ETA: 43s - loss: 0.2902 - regression_loss: 0.2743 - classification_loss: 0.0159 373/500 [=====================>........] - ETA: 43s - loss: 0.2901 - regression_loss: 0.2742 - classification_loss: 0.0159 374/500 [=====================>........] - ETA: 43s - loss: 0.2901 - regression_loss: 0.2742 - classification_loss: 0.0159 375/500 [=====================>........] - ETA: 42s - loss: 0.2896 - regression_loss: 0.2737 - classification_loss: 0.0159 376/500 [=====================>........] - ETA: 42s - loss: 0.2897 - regression_loss: 0.2738 - classification_loss: 0.0159 377/500 [=====================>........] - ETA: 42s - loss: 0.2897 - regression_loss: 0.2739 - classification_loss: 0.0158 378/500 [=====================>........] - ETA: 41s - loss: 0.2893 - regression_loss: 0.2735 - classification_loss: 0.0158 379/500 [=====================>........] - ETA: 41s - loss: 0.2890 - regression_loss: 0.2732 - classification_loss: 0.0158 380/500 [=====================>........] - ETA: 41s - loss: 0.2888 - regression_loss: 0.2730 - classification_loss: 0.0158 381/500 [=====================>........] - ETA: 40s - loss: 0.2889 - regression_loss: 0.2730 - classification_loss: 0.0158 382/500 [=====================>........] - ETA: 40s - loss: 0.2884 - regression_loss: 0.2726 - classification_loss: 0.0158 383/500 [=====================>........] - ETA: 40s - loss: 0.2883 - regression_loss: 0.2725 - classification_loss: 0.0158 384/500 [======================>.......] - ETA: 39s - loss: 0.2880 - regression_loss: 0.2723 - classification_loss: 0.0158 385/500 [======================>.......] - ETA: 39s - loss: 0.2880 - regression_loss: 0.2723 - classification_loss: 0.0157 386/500 [======================>.......] - ETA: 39s - loss: 0.2881 - regression_loss: 0.2724 - classification_loss: 0.0157 387/500 [======================>.......] - ETA: 38s - loss: 0.2879 - regression_loss: 0.2722 - classification_loss: 0.0157 388/500 [======================>.......] 
- ETA: 38s - loss: 0.2876 - regression_loss: 0.2719 - classification_loss: 0.0157 389/500 [======================>.......] - ETA: 38s - loss: 0.2879 - regression_loss: 0.2720 - classification_loss: 0.0158 390/500 [======================>.......] - ETA: 37s - loss: 0.2877 - regression_loss: 0.2719 - classification_loss: 0.0158 391/500 [======================>.......] - ETA: 37s - loss: 0.2876 - regression_loss: 0.2718 - classification_loss: 0.0158 392/500 [======================>.......] - ETA: 37s - loss: 0.2872 - regression_loss: 0.2715 - classification_loss: 0.0158 393/500 [======================>.......] - ETA: 36s - loss: 0.2875 - regression_loss: 0.2718 - classification_loss: 0.0158 394/500 [======================>.......] - ETA: 36s - loss: 0.2875 - regression_loss: 0.2717 - classification_loss: 0.0157 395/500 [======================>.......] - ETA: 36s - loss: 0.2875 - regression_loss: 0.2717 - classification_loss: 0.0157 396/500 [======================>.......] - ETA: 35s - loss: 0.2878 - regression_loss: 0.2720 - classification_loss: 0.0158 397/500 [======================>.......] - ETA: 35s - loss: 0.2874 - regression_loss: 0.2716 - classification_loss: 0.0158 398/500 [======================>.......] - ETA: 35s - loss: 0.2875 - regression_loss: 0.2718 - classification_loss: 0.0158 399/500 [======================>.......] - ETA: 34s - loss: 0.2873 - regression_loss: 0.2715 - classification_loss: 0.0157 400/500 [=======================>......] - ETA: 34s - loss: 0.2868 - regression_loss: 0.2710 - classification_loss: 0.0157 401/500 [=======================>......] - ETA: 34s - loss: 0.2866 - regression_loss: 0.2709 - classification_loss: 0.0157 402/500 [=======================>......] - ETA: 33s - loss: 0.2864 - regression_loss: 0.2707 - classification_loss: 0.0157 403/500 [=======================>......] - ETA: 33s - loss: 0.2861 - regression_loss: 0.2704 - classification_loss: 0.0157 404/500 [=======================>......] 
- ETA: 33s - loss: 0.2862 - regression_loss: 0.2705 - classification_loss: 0.0157 405/500 [=======================>......] - ETA: 32s - loss: 0.2868 - regression_loss: 0.2711 - classification_loss: 0.0157 406/500 [=======================>......] - ETA: 32s - loss: 0.2863 - regression_loss: 0.2706 - classification_loss: 0.0157 407/500 [=======================>......] - ETA: 31s - loss: 0.2862 - regression_loss: 0.2706 - classification_loss: 0.0156 408/500 [=======================>......] - ETA: 31s - loss: 0.2866 - regression_loss: 0.2709 - classification_loss: 0.0156 409/500 [=======================>......] - ETA: 31s - loss: 0.2863 - regression_loss: 0.2707 - classification_loss: 0.0156 410/500 [=======================>......] - ETA: 30s - loss: 0.2863 - regression_loss: 0.2707 - classification_loss: 0.0156 411/500 [=======================>......] - ETA: 30s - loss: 0.2865 - regression_loss: 0.2709 - classification_loss: 0.0156 412/500 [=======================>......] - ETA: 30s - loss: 0.2863 - regression_loss: 0.2707 - classification_loss: 0.0156 413/500 [=======================>......] - ETA: 29s - loss: 0.2863 - regression_loss: 0.2707 - classification_loss: 0.0156 414/500 [=======================>......] - ETA: 29s - loss: 0.2861 - regression_loss: 0.2705 - classification_loss: 0.0156 415/500 [=======================>......] - ETA: 29s - loss: 0.2858 - regression_loss: 0.2702 - classification_loss: 0.0156 416/500 [=======================>......] - ETA: 28s - loss: 0.2858 - regression_loss: 0.2702 - classification_loss: 0.0156 417/500 [========================>.....] - ETA: 28s - loss: 0.2855 - regression_loss: 0.2700 - classification_loss: 0.0155 418/500 [========================>.....] - ETA: 28s - loss: 0.2864 - regression_loss: 0.2708 - classification_loss: 0.0155 419/500 [========================>.....] - ETA: 27s - loss: 0.2866 - regression_loss: 0.2711 - classification_loss: 0.0156 420/500 [========================>.....] 
- ETA: 27s - loss: 0.2864 - regression_loss: 0.2709 - classification_loss: 0.0155 421/500 [========================>.....] - ETA: 27s - loss: 0.2863 - regression_loss: 0.2707 - classification_loss: 0.0156 422/500 [========================>.....] - ETA: 26s - loss: 0.2859 - regression_loss: 0.2703 - classification_loss: 0.0155 423/500 [========================>.....] - ETA: 26s - loss: 0.2859 - regression_loss: 0.2704 - classification_loss: 0.0155 424/500 [========================>.....] - ETA: 26s - loss: 0.2861 - regression_loss: 0.2706 - classification_loss: 0.0156 425/500 [========================>.....] - ETA: 25s - loss: 0.2862 - regression_loss: 0.2707 - classification_loss: 0.0156 426/500 [========================>.....] - ETA: 25s - loss: 0.2863 - regression_loss: 0.2708 - classification_loss: 0.0155 427/500 [========================>.....] - ETA: 25s - loss: 0.2861 - regression_loss: 0.2706 - classification_loss: 0.0155 428/500 [========================>.....] - ETA: 24s - loss: 0.2858 - regression_loss: 0.2703 - classification_loss: 0.0155 429/500 [========================>.....] - ETA: 24s - loss: 0.2860 - regression_loss: 0.2705 - classification_loss: 0.0156 430/500 [========================>.....] - ETA: 24s - loss: 0.2859 - regression_loss: 0.2703 - classification_loss: 0.0155 431/500 [========================>.....] - ETA: 23s - loss: 0.2857 - regression_loss: 0.2702 - classification_loss: 0.0156 432/500 [========================>.....] - ETA: 23s - loss: 0.2866 - regression_loss: 0.2708 - classification_loss: 0.0157 433/500 [========================>.....] - ETA: 22s - loss: 0.2869 - regression_loss: 0.2712 - classification_loss: 0.0157 434/500 [=========================>....] - ETA: 22s - loss: 0.2868 - regression_loss: 0.2710 - classification_loss: 0.0157 435/500 [=========================>....] - ETA: 22s - loss: 0.2870 - regression_loss: 0.2712 - classification_loss: 0.0157 436/500 [=========================>....] 
- ETA: 21s - loss: 0.2870 - regression_loss: 0.2713 - classification_loss: 0.0157 437/500 [=========================>....] - ETA: 21s - loss: 0.2874 - regression_loss: 0.2716 - classification_loss: 0.0158 438/500 [=========================>....] - ETA: 21s - loss: 0.2879 - regression_loss: 0.2720 - classification_loss: 0.0159 439/500 [=========================>....] - ETA: 20s - loss: 0.2882 - regression_loss: 0.2723 - classification_loss: 0.0159 440/500 [=========================>....] - ETA: 20s - loss: 0.2883 - regression_loss: 0.2724 - classification_loss: 0.0159 441/500 [=========================>....] - ETA: 20s - loss: 0.2886 - regression_loss: 0.2727 - classification_loss: 0.0159 442/500 [=========================>....] - ETA: 19s - loss: 0.2887 - regression_loss: 0.2727 - classification_loss: 0.0159 443/500 [=========================>....] - ETA: 19s - loss: 0.2883 - regression_loss: 0.2724 - classification_loss: 0.0159 444/500 [=========================>....] - ETA: 19s - loss: 0.2885 - regression_loss: 0.2725 - classification_loss: 0.0160 445/500 [=========================>....] - ETA: 18s - loss: 0.2885 - regression_loss: 0.2725 - classification_loss: 0.0160 446/500 [=========================>....] - ETA: 18s - loss: 0.2887 - regression_loss: 0.2727 - classification_loss: 0.0160 447/500 [=========================>....] - ETA: 18s - loss: 0.2891 - regression_loss: 0.2730 - classification_loss: 0.0160 448/500 [=========================>....] - ETA: 17s - loss: 0.2895 - regression_loss: 0.2734 - classification_loss: 0.0161 449/500 [=========================>....] - ETA: 17s - loss: 0.2892 - regression_loss: 0.2732 - classification_loss: 0.0161 450/500 [==========================>...] - ETA: 17s - loss: 0.2893 - regression_loss: 0.2732 - classification_loss: 0.0161 451/500 [==========================>...] - ETA: 16s - loss: 0.2893 - regression_loss: 0.2732 - classification_loss: 0.0161 452/500 [==========================>...] 
- ETA: 16s - loss: 0.2891 - regression_loss: 0.2731 - classification_loss: 0.0161 453/500 [==========================>...] - ETA: 16s - loss: 0.2890 - regression_loss: 0.2729 - classification_loss: 0.0160 454/500 [==========================>...] - ETA: 15s - loss: 0.2892 - regression_loss: 0.2732 - classification_loss: 0.0160 455/500 [==========================>...] - ETA: 15s - loss: 0.2897 - regression_loss: 0.2736 - classification_loss: 0.0161 456/500 [==========================>...] - ETA: 15s - loss: 0.2897 - regression_loss: 0.2736 - classification_loss: 0.0161 457/500 [==========================>...] - ETA: 14s - loss: 0.2897 - regression_loss: 0.2736 - classification_loss: 0.0161 458/500 [==========================>...] - ETA: 14s - loss: 0.2894 - regression_loss: 0.2733 - classification_loss: 0.0161 459/500 [==========================>...] - ETA: 14s - loss: 0.2893 - regression_loss: 0.2732 - classification_loss: 0.0161 460/500 [==========================>...] - ETA: 13s - loss: 0.2891 - regression_loss: 0.2730 - classification_loss: 0.0161 461/500 [==========================>...] - ETA: 13s - loss: 0.2889 - regression_loss: 0.2728 - classification_loss: 0.0161 462/500 [==========================>...] - ETA: 13s - loss: 0.2891 - regression_loss: 0.2730 - classification_loss: 0.0161 463/500 [==========================>...] - ETA: 12s - loss: 0.2894 - regression_loss: 0.2733 - classification_loss: 0.0161 464/500 [==========================>...] - ETA: 12s - loss: 0.2898 - regression_loss: 0.2737 - classification_loss: 0.0161 465/500 [==========================>...] - ETA: 11s - loss: 0.2898 - regression_loss: 0.2737 - classification_loss: 0.0161 466/500 [==========================>...] - ETA: 11s - loss: 0.2900 - regression_loss: 0.2739 - classification_loss: 0.0160 467/500 [===========================>..] - ETA: 11s - loss: 0.2901 - regression_loss: 0.2740 - classification_loss: 0.0160 468/500 [===========================>..] 
- ETA: 10s - loss: 0.2903 - regression_loss: 0.2743 - classification_loss: 0.0161 469/500 [===========================>..] - ETA: 10s - loss: 0.2905 - regression_loss: 0.2744 - classification_loss: 0.0161 470/500 [===========================>..] - ETA: 10s - loss: 0.2908 - regression_loss: 0.2747 - classification_loss: 0.0161 471/500 [===========================>..] - ETA: 9s - loss: 0.2906 - regression_loss: 0.2745 - classification_loss: 0.0161  472/500 [===========================>..] - ETA: 9s - loss: 0.2905 - regression_loss: 0.2744 - classification_loss: 0.0161 473/500 [===========================>..] - ETA: 9s - loss: 0.2908 - regression_loss: 0.2747 - classification_loss: 0.0161 474/500 [===========================>..] - ETA: 8s - loss: 0.2904 - regression_loss: 0.2744 - classification_loss: 0.0160 475/500 [===========================>..] - ETA: 8s - loss: 0.2906 - regression_loss: 0.2746 - classification_loss: 0.0160 476/500 [===========================>..] - ETA: 8s - loss: 0.2908 - regression_loss: 0.2748 - classification_loss: 0.0160 477/500 [===========================>..] - ETA: 7s - loss: 0.2907 - regression_loss: 0.2747 - classification_loss: 0.0160 478/500 [===========================>..] - ETA: 7s - loss: 0.2910 - regression_loss: 0.2750 - classification_loss: 0.0160 479/500 [===========================>..] - ETA: 7s - loss: 0.2913 - regression_loss: 0.2753 - classification_loss: 0.0160 480/500 [===========================>..] - ETA: 6s - loss: 0.2913 - regression_loss: 0.2753 - classification_loss: 0.0160 481/500 [===========================>..] - ETA: 6s - loss: 0.2917 - regression_loss: 0.2757 - classification_loss: 0.0160 482/500 [===========================>..] - ETA: 6s - loss: 0.2914 - regression_loss: 0.2754 - classification_loss: 0.0160 483/500 [===========================>..] - ETA: 5s - loss: 0.2913 - regression_loss: 0.2753 - classification_loss: 0.0160 484/500 [============================>.] 
- ETA: 5s - loss: 0.2911 - regression_loss: 0.2752 - classification_loss: 0.0160 485/500 [============================>.] - ETA: 5s - loss: 0.2909 - regression_loss: 0.2750 - classification_loss: 0.0159 486/500 [============================>.] - ETA: 4s - loss: 0.2910 - regression_loss: 0.2751 - classification_loss: 0.0159 487/500 [============================>.] - ETA: 4s - loss: 0.2908 - regression_loss: 0.2749 - classification_loss: 0.0159 488/500 [============================>.] - ETA: 4s - loss: 0.2909 - regression_loss: 0.2749 - classification_loss: 0.0160 489/500 [============================>.] - ETA: 3s - loss: 0.2907 - regression_loss: 0.2747 - classification_loss: 0.0160 490/500 [============================>.] - ETA: 3s - loss: 0.2905 - regression_loss: 0.2745 - classification_loss: 0.0160 491/500 [============================>.] - ETA: 3s - loss: 0.2903 - regression_loss: 0.2743 - classification_loss: 0.0159 492/500 [============================>.] - ETA: 2s - loss: 0.2902 - regression_loss: 0.2743 - classification_loss: 0.0159 493/500 [============================>.] - ETA: 2s - loss: 0.2901 - regression_loss: 0.2742 - classification_loss: 0.0159 494/500 [============================>.] - ETA: 2s - loss: 0.2902 - regression_loss: 0.2743 - classification_loss: 0.0159 495/500 [============================>.] - ETA: 1s - loss: 0.2898 - regression_loss: 0.2739 - classification_loss: 0.0159 496/500 [============================>.] - ETA: 1s - loss: 0.2896 - regression_loss: 0.2737 - classification_loss: 0.0159 497/500 [============================>.] - ETA: 1s - loss: 0.2898 - regression_loss: 0.2739 - classification_loss: 0.0159 498/500 [============================>.] - ETA: 0s - loss: 0.2895 - regression_loss: 0.2736 - classification_loss: 0.0159 499/500 [============================>.] 
- ETA: 0s - loss: 0.2893 - regression_loss: 0.2734 - classification_loss: 0.0159 500/500 [==============================] - 171s 342ms/step - loss: 0.2891 - regression_loss: 0.2733 - classification_loss: 0.0158
1172 instances of class plum with average precision: 0.7494
mAP: 0.7494
Epoch 00039: saving model to ./training/snapshots/resnet101_pascal_39.h5
Epoch 40/150
[per-batch progress output for steps 1-13 omitted; loss between 0.2167 and 0.2683] 14/500 [..............................]
- ETA: 2:37 - loss: 0.2521 - regression_loss: 0.2399 - classification_loss: 0.0121 [per-batch progress output for steps 15-301 omitted; loss fluctuating between 0.24 and 0.31 before settling near 0.2756, classification_loss near 0.015] 302/500 [=================>............]
- ETA: 1:07 - loss: 0.2760 - regression_loss: 0.2610 - classification_loss: 0.0149 303/500 [=================>............] - ETA: 1:07 - loss: 0.2758 - regression_loss: 0.2609 - classification_loss: 0.0149 304/500 [=================>............] - ETA: 1:06 - loss: 0.2751 - regression_loss: 0.2603 - classification_loss: 0.0149 305/500 [=================>............] - ETA: 1:06 - loss: 0.2748 - regression_loss: 0.2599 - classification_loss: 0.0149 306/500 [=================>............] - ETA: 1:06 - loss: 0.2749 - regression_loss: 0.2600 - classification_loss: 0.0149 307/500 [=================>............] - ETA: 1:05 - loss: 0.2750 - regression_loss: 0.2601 - classification_loss: 0.0149 308/500 [=================>............] - ETA: 1:05 - loss: 0.2749 - regression_loss: 0.2600 - classification_loss: 0.0149 309/500 [=================>............] - ETA: 1:05 - loss: 0.2751 - regression_loss: 0.2602 - classification_loss: 0.0149 310/500 [=================>............] - ETA: 1:04 - loss: 0.2751 - regression_loss: 0.2602 - classification_loss: 0.0149 311/500 [=================>............] - ETA: 1:04 - loss: 0.2749 - regression_loss: 0.2601 - classification_loss: 0.0148 312/500 [=================>............] - ETA: 1:04 - loss: 0.2751 - regression_loss: 0.2602 - classification_loss: 0.0149 313/500 [=================>............] - ETA: 1:03 - loss: 0.2753 - regression_loss: 0.2604 - classification_loss: 0.0149 314/500 [=================>............] - ETA: 1:03 - loss: 0.2751 - regression_loss: 0.2602 - classification_loss: 0.0149 315/500 [=================>............] - ETA: 1:03 - loss: 0.2749 - regression_loss: 0.2601 - classification_loss: 0.0149 316/500 [=================>............] - ETA: 1:02 - loss: 0.2746 - regression_loss: 0.2598 - classification_loss: 0.0148 317/500 [==================>...........] - ETA: 1:02 - loss: 0.2742 - regression_loss: 0.2594 - classification_loss: 0.0148 318/500 [==================>...........] 
- ETA: 1:02 - loss: 0.2740 - regression_loss: 0.2592 - classification_loss: 0.0148 319/500 [==================>...........] - ETA: 1:01 - loss: 0.2739 - regression_loss: 0.2591 - classification_loss: 0.0148 320/500 [==================>...........] - ETA: 1:01 - loss: 0.2742 - regression_loss: 0.2594 - classification_loss: 0.0148 321/500 [==================>...........] - ETA: 1:01 - loss: 0.2743 - regression_loss: 0.2596 - classification_loss: 0.0148 322/500 [==================>...........] - ETA: 1:00 - loss: 0.2748 - regression_loss: 0.2599 - classification_loss: 0.0149 323/500 [==================>...........] - ETA: 1:00 - loss: 0.2750 - regression_loss: 0.2601 - classification_loss: 0.0149 324/500 [==================>...........] - ETA: 1:00 - loss: 0.2753 - regression_loss: 0.2604 - classification_loss: 0.0149 325/500 [==================>...........] - ETA: 59s - loss: 0.2748 - regression_loss: 0.2599 - classification_loss: 0.0148  326/500 [==================>...........] - ETA: 59s - loss: 0.2745 - regression_loss: 0.2597 - classification_loss: 0.0148 327/500 [==================>...........] - ETA: 59s - loss: 0.2752 - regression_loss: 0.2603 - classification_loss: 0.0148 328/500 [==================>...........] - ETA: 58s - loss: 0.2758 - regression_loss: 0.2610 - classification_loss: 0.0148 329/500 [==================>...........] - ETA: 58s - loss: 0.2755 - regression_loss: 0.2607 - classification_loss: 0.0148 330/500 [==================>...........] - ETA: 57s - loss: 0.2758 - regression_loss: 0.2610 - classification_loss: 0.0148 331/500 [==================>...........] - ETA: 57s - loss: 0.2754 - regression_loss: 0.2607 - classification_loss: 0.0147 332/500 [==================>...........] - ETA: 57s - loss: 0.2761 - regression_loss: 0.2612 - classification_loss: 0.0149 333/500 [==================>...........] - ETA: 56s - loss: 0.2765 - regression_loss: 0.2616 - classification_loss: 0.0149 334/500 [===================>..........] 
- ETA: 56s - loss: 0.2773 - regression_loss: 0.2624 - classification_loss: 0.0149 335/500 [===================>..........] - ETA: 56s - loss: 0.2779 - regression_loss: 0.2630 - classification_loss: 0.0149 336/500 [===================>..........] - ETA: 55s - loss: 0.2784 - regression_loss: 0.2634 - classification_loss: 0.0150 337/500 [===================>..........] - ETA: 55s - loss: 0.2784 - regression_loss: 0.2634 - classification_loss: 0.0150 338/500 [===================>..........] - ETA: 55s - loss: 0.2787 - regression_loss: 0.2637 - classification_loss: 0.0150 339/500 [===================>..........] - ETA: 54s - loss: 0.2783 - regression_loss: 0.2634 - classification_loss: 0.0149 340/500 [===================>..........] - ETA: 54s - loss: 0.2786 - regression_loss: 0.2637 - classification_loss: 0.0149 341/500 [===================>..........] - ETA: 54s - loss: 0.2780 - regression_loss: 0.2631 - classification_loss: 0.0149 342/500 [===================>..........] - ETA: 53s - loss: 0.2780 - regression_loss: 0.2631 - classification_loss: 0.0149 343/500 [===================>..........] - ETA: 53s - loss: 0.2777 - regression_loss: 0.2628 - classification_loss: 0.0149 344/500 [===================>..........] - ETA: 53s - loss: 0.2775 - regression_loss: 0.2626 - classification_loss: 0.0149 345/500 [===================>..........] - ETA: 52s - loss: 0.2778 - regression_loss: 0.2629 - classification_loss: 0.0149 346/500 [===================>..........] - ETA: 52s - loss: 0.2776 - regression_loss: 0.2627 - classification_loss: 0.0148 347/500 [===================>..........] - ETA: 52s - loss: 0.2773 - regression_loss: 0.2625 - classification_loss: 0.0148 348/500 [===================>..........] - ETA: 51s - loss: 0.2769 - regression_loss: 0.2621 - classification_loss: 0.0148 349/500 [===================>..........] - ETA: 51s - loss: 0.2768 - regression_loss: 0.2620 - classification_loss: 0.0148 350/500 [====================>.........] 
- ETA: 51s - loss: 0.2767 - regression_loss: 0.2619 - classification_loss: 0.0148 351/500 [====================>.........] - ETA: 50s - loss: 0.2764 - regression_loss: 0.2616 - classification_loss: 0.0147 352/500 [====================>.........] - ETA: 50s - loss: 0.2764 - regression_loss: 0.2617 - classification_loss: 0.0147 353/500 [====================>.........] - ETA: 50s - loss: 0.2760 - regression_loss: 0.2613 - classification_loss: 0.0147 354/500 [====================>.........] - ETA: 49s - loss: 0.2762 - regression_loss: 0.2615 - classification_loss: 0.0147 355/500 [====================>.........] - ETA: 49s - loss: 0.2760 - regression_loss: 0.2613 - classification_loss: 0.0147 356/500 [====================>.........] - ETA: 49s - loss: 0.2762 - regression_loss: 0.2614 - classification_loss: 0.0148 357/500 [====================>.........] - ETA: 48s - loss: 0.2764 - regression_loss: 0.2616 - classification_loss: 0.0148 358/500 [====================>.........] - ETA: 48s - loss: 0.2764 - regression_loss: 0.2616 - classification_loss: 0.0148 359/500 [====================>.........] - ETA: 48s - loss: 0.2759 - regression_loss: 0.2612 - classification_loss: 0.0148 360/500 [====================>.........] - ETA: 47s - loss: 0.2763 - regression_loss: 0.2615 - classification_loss: 0.0148 361/500 [====================>.........] - ETA: 47s - loss: 0.2757 - regression_loss: 0.2609 - classification_loss: 0.0148 362/500 [====================>.........] - ETA: 47s - loss: 0.2759 - regression_loss: 0.2611 - classification_loss: 0.0148 363/500 [====================>.........] - ETA: 46s - loss: 0.2761 - regression_loss: 0.2613 - classification_loss: 0.0148 364/500 [====================>.........] - ETA: 46s - loss: 0.2761 - regression_loss: 0.2613 - classification_loss: 0.0148 365/500 [====================>.........] - ETA: 46s - loss: 0.2756 - regression_loss: 0.2609 - classification_loss: 0.0148 366/500 [====================>.........] 
- ETA: 45s - loss: 0.2755 - regression_loss: 0.2607 - classification_loss: 0.0148 367/500 [=====================>........] - ETA: 45s - loss: 0.2755 - regression_loss: 0.2608 - classification_loss: 0.0147 368/500 [=====================>........] - ETA: 45s - loss: 0.2753 - regression_loss: 0.2606 - classification_loss: 0.0147 369/500 [=====================>........] - ETA: 44s - loss: 0.2754 - regression_loss: 0.2607 - classification_loss: 0.0147 370/500 [=====================>........] - ETA: 44s - loss: 0.2753 - regression_loss: 0.2606 - classification_loss: 0.0147 371/500 [=====================>........] - ETA: 43s - loss: 0.2751 - regression_loss: 0.2604 - classification_loss: 0.0147 372/500 [=====================>........] - ETA: 43s - loss: 0.2754 - regression_loss: 0.2607 - classification_loss: 0.0147 373/500 [=====================>........] - ETA: 43s - loss: 0.2753 - regression_loss: 0.2607 - classification_loss: 0.0147 374/500 [=====================>........] - ETA: 42s - loss: 0.2751 - regression_loss: 0.2604 - classification_loss: 0.0147 375/500 [=====================>........] - ETA: 42s - loss: 0.2751 - regression_loss: 0.2605 - classification_loss: 0.0146 376/500 [=====================>........] - ETA: 42s - loss: 0.2749 - regression_loss: 0.2603 - classification_loss: 0.0146 377/500 [=====================>........] - ETA: 41s - loss: 0.2745 - regression_loss: 0.2600 - classification_loss: 0.0146 378/500 [=====================>........] - ETA: 41s - loss: 0.2747 - regression_loss: 0.2601 - classification_loss: 0.0146 379/500 [=====================>........] - ETA: 41s - loss: 0.2749 - regression_loss: 0.2602 - classification_loss: 0.0146 380/500 [=====================>........] - ETA: 40s - loss: 0.2747 - regression_loss: 0.2601 - classification_loss: 0.0146 381/500 [=====================>........] - ETA: 40s - loss: 0.2748 - regression_loss: 0.2602 - classification_loss: 0.0146 382/500 [=====================>........] 
- ETA: 40s - loss: 0.2749 - regression_loss: 0.2602 - classification_loss: 0.0146 383/500 [=====================>........] - ETA: 39s - loss: 0.2754 - regression_loss: 0.2607 - classification_loss: 0.0147 384/500 [======================>.......] - ETA: 39s - loss: 0.2752 - regression_loss: 0.2605 - classification_loss: 0.0147 385/500 [======================>.......] - ETA: 39s - loss: 0.2750 - regression_loss: 0.2603 - classification_loss: 0.0147 386/500 [======================>.......] - ETA: 38s - loss: 0.2754 - regression_loss: 0.2606 - classification_loss: 0.0148 387/500 [======================>.......] - ETA: 38s - loss: 0.2759 - regression_loss: 0.2611 - classification_loss: 0.0149 388/500 [======================>.......] - ETA: 38s - loss: 0.2758 - regression_loss: 0.2610 - classification_loss: 0.0148 389/500 [======================>.......] - ETA: 37s - loss: 0.2757 - regression_loss: 0.2608 - classification_loss: 0.0148 390/500 [======================>.......] - ETA: 37s - loss: 0.2754 - regression_loss: 0.2606 - classification_loss: 0.0148 391/500 [======================>.......] - ETA: 37s - loss: 0.2751 - regression_loss: 0.2603 - classification_loss: 0.0148 392/500 [======================>.......] - ETA: 36s - loss: 0.2753 - regression_loss: 0.2605 - classification_loss: 0.0148 393/500 [======================>.......] - ETA: 36s - loss: 0.2749 - regression_loss: 0.2601 - classification_loss: 0.0148 394/500 [======================>.......] - ETA: 36s - loss: 0.2751 - regression_loss: 0.2603 - classification_loss: 0.0148 395/500 [======================>.......] - ETA: 35s - loss: 0.2758 - regression_loss: 0.2609 - classification_loss: 0.0149 396/500 [======================>.......] - ETA: 35s - loss: 0.2756 - regression_loss: 0.2608 - classification_loss: 0.0149 397/500 [======================>.......] - ETA: 35s - loss: 0.2753 - regression_loss: 0.2605 - classification_loss: 0.0148 398/500 [======================>.......] 
- ETA: 34s - loss: 0.2757 - regression_loss: 0.2608 - classification_loss: 0.0148 399/500 [======================>.......] - ETA: 34s - loss: 0.2760 - regression_loss: 0.2612 - classification_loss: 0.0148 400/500 [=======================>......] - ETA: 34s - loss: 0.2758 - regression_loss: 0.2609 - classification_loss: 0.0148 401/500 [=======================>......] - ETA: 33s - loss: 0.2755 - regression_loss: 0.2607 - classification_loss: 0.0148 402/500 [=======================>......] - ETA: 33s - loss: 0.2752 - regression_loss: 0.2605 - classification_loss: 0.0148 403/500 [=======================>......] - ETA: 33s - loss: 0.2750 - regression_loss: 0.2602 - classification_loss: 0.0147 404/500 [=======================>......] - ETA: 32s - loss: 0.2747 - regression_loss: 0.2600 - classification_loss: 0.0147 405/500 [=======================>......] - ETA: 32s - loss: 0.2748 - regression_loss: 0.2600 - classification_loss: 0.0147 406/500 [=======================>......] - ETA: 32s - loss: 0.2744 - regression_loss: 0.2597 - classification_loss: 0.0147 407/500 [=======================>......] - ETA: 31s - loss: 0.2747 - regression_loss: 0.2600 - classification_loss: 0.0147 408/500 [=======================>......] - ETA: 31s - loss: 0.2743 - regression_loss: 0.2596 - classification_loss: 0.0147 409/500 [=======================>......] - ETA: 30s - loss: 0.2740 - regression_loss: 0.2593 - classification_loss: 0.0147 410/500 [=======================>......] - ETA: 30s - loss: 0.2738 - regression_loss: 0.2591 - classification_loss: 0.0147 411/500 [=======================>......] - ETA: 30s - loss: 0.2741 - regression_loss: 0.2594 - classification_loss: 0.0147 412/500 [=======================>......] - ETA: 29s - loss: 0.2743 - regression_loss: 0.2596 - classification_loss: 0.0147 413/500 [=======================>......] - ETA: 29s - loss: 0.2741 - regression_loss: 0.2594 - classification_loss: 0.0147 414/500 [=======================>......] 
- ETA: 29s - loss: 0.2737 - regression_loss: 0.2590 - classification_loss: 0.0147 415/500 [=======================>......] - ETA: 28s - loss: 0.2736 - regression_loss: 0.2589 - classification_loss: 0.0147 416/500 [=======================>......] - ETA: 28s - loss: 0.2736 - regression_loss: 0.2589 - classification_loss: 0.0147 417/500 [========================>.....] - ETA: 28s - loss: 0.2734 - regression_loss: 0.2588 - classification_loss: 0.0146 418/500 [========================>.....] - ETA: 27s - loss: 0.2733 - regression_loss: 0.2587 - classification_loss: 0.0146 419/500 [========================>.....] - ETA: 27s - loss: 0.2739 - regression_loss: 0.2592 - classification_loss: 0.0146 420/500 [========================>.....] - ETA: 27s - loss: 0.2741 - regression_loss: 0.2594 - classification_loss: 0.0146 421/500 [========================>.....] - ETA: 26s - loss: 0.2740 - regression_loss: 0.2593 - classification_loss: 0.0146 422/500 [========================>.....] - ETA: 26s - loss: 0.2740 - regression_loss: 0.2594 - classification_loss: 0.0146 423/500 [========================>.....] - ETA: 26s - loss: 0.2737 - regression_loss: 0.2591 - classification_loss: 0.0146 424/500 [========================>.....] - ETA: 25s - loss: 0.2736 - regression_loss: 0.2590 - classification_loss: 0.0146 425/500 [========================>.....] - ETA: 25s - loss: 0.2735 - regression_loss: 0.2589 - classification_loss: 0.0146 426/500 [========================>.....] - ETA: 25s - loss: 0.2733 - regression_loss: 0.2587 - classification_loss: 0.0146 427/500 [========================>.....] - ETA: 24s - loss: 0.2732 - regression_loss: 0.2586 - classification_loss: 0.0146 428/500 [========================>.....] - ETA: 24s - loss: 0.2738 - regression_loss: 0.2592 - classification_loss: 0.0146 429/500 [========================>.....] - ETA: 24s - loss: 0.2741 - regression_loss: 0.2594 - classification_loss: 0.0147 430/500 [========================>.....] 
- ETA: 23s - loss: 0.2742 - regression_loss: 0.2595 - classification_loss: 0.0147 431/500 [========================>.....] - ETA: 23s - loss: 0.2747 - regression_loss: 0.2601 - classification_loss: 0.0146 432/500 [========================>.....] - ETA: 23s - loss: 0.2748 - regression_loss: 0.2602 - classification_loss: 0.0146 433/500 [========================>.....] - ETA: 22s - loss: 0.2749 - regression_loss: 0.2603 - classification_loss: 0.0146 434/500 [=========================>....] - ETA: 22s - loss: 0.2748 - regression_loss: 0.2602 - classification_loss: 0.0146 435/500 [=========================>....] - ETA: 22s - loss: 0.2749 - regression_loss: 0.2603 - classification_loss: 0.0146 436/500 [=========================>....] - ETA: 21s - loss: 0.2750 - regression_loss: 0.2603 - classification_loss: 0.0147 437/500 [=========================>....] - ETA: 21s - loss: 0.2745 - regression_loss: 0.2599 - classification_loss: 0.0146 438/500 [=========================>....] - ETA: 21s - loss: 0.2749 - regression_loss: 0.2603 - classification_loss: 0.0146 439/500 [=========================>....] - ETA: 20s - loss: 0.2750 - regression_loss: 0.2603 - classification_loss: 0.0146 440/500 [=========================>....] - ETA: 20s - loss: 0.2747 - regression_loss: 0.2601 - classification_loss: 0.0146 441/500 [=========================>....] - ETA: 20s - loss: 0.2753 - regression_loss: 0.2607 - classification_loss: 0.0146 442/500 [=========================>....] - ETA: 19s - loss: 0.2750 - regression_loss: 0.2604 - classification_loss: 0.0146 443/500 [=========================>....] - ETA: 19s - loss: 0.2747 - regression_loss: 0.2601 - classification_loss: 0.0146 444/500 [=========================>....] - ETA: 19s - loss: 0.2746 - regression_loss: 0.2600 - classification_loss: 0.0146 445/500 [=========================>....] - ETA: 18s - loss: 0.2746 - regression_loss: 0.2600 - classification_loss: 0.0146 446/500 [=========================>....] 
- ETA: 18s - loss: 0.2745 - regression_loss: 0.2599 - classification_loss: 0.0146 447/500 [=========================>....] - ETA: 18s - loss: 0.2745 - regression_loss: 0.2599 - classification_loss: 0.0146 448/500 [=========================>....] - ETA: 17s - loss: 0.2742 - regression_loss: 0.2596 - classification_loss: 0.0146 449/500 [=========================>....] - ETA: 17s - loss: 0.2745 - regression_loss: 0.2599 - classification_loss: 0.0146 450/500 [==========================>...] - ETA: 17s - loss: 0.2742 - regression_loss: 0.2596 - classification_loss: 0.0146 451/500 [==========================>...] - ETA: 16s - loss: 0.2742 - regression_loss: 0.2596 - classification_loss: 0.0146 452/500 [==========================>...] - ETA: 16s - loss: 0.2743 - regression_loss: 0.2597 - classification_loss: 0.0146 453/500 [==========================>...] - ETA: 16s - loss: 0.2744 - regression_loss: 0.2598 - classification_loss: 0.0146 454/500 [==========================>...] - ETA: 15s - loss: 0.2742 - regression_loss: 0.2596 - classification_loss: 0.0146 455/500 [==========================>...] - ETA: 15s - loss: 0.2743 - regression_loss: 0.2597 - classification_loss: 0.0146 456/500 [==========================>...] - ETA: 14s - loss: 0.2742 - regression_loss: 0.2596 - classification_loss: 0.0146 457/500 [==========================>...] - ETA: 14s - loss: 0.2740 - regression_loss: 0.2594 - classification_loss: 0.0146 458/500 [==========================>...] - ETA: 14s - loss: 0.2742 - regression_loss: 0.2595 - classification_loss: 0.0146 459/500 [==========================>...] - ETA: 13s - loss: 0.2741 - regression_loss: 0.2595 - classification_loss: 0.0146 460/500 [==========================>...] - ETA: 13s - loss: 0.2739 - regression_loss: 0.2593 - classification_loss: 0.0146 461/500 [==========================>...] - ETA: 13s - loss: 0.2739 - regression_loss: 0.2593 - classification_loss: 0.0146 462/500 [==========================>...] 
- ETA: 12s - loss: 0.2734 - regression_loss: 0.2589 - classification_loss: 0.0146 463/500 [==========================>...] - ETA: 12s - loss: 0.2732 - regression_loss: 0.2587 - classification_loss: 0.0145 464/500 [==========================>...] - ETA: 12s - loss: 0.2731 - regression_loss: 0.2586 - classification_loss: 0.0145 465/500 [==========================>...] - ETA: 11s - loss: 0.2731 - regression_loss: 0.2586 - classification_loss: 0.0145 466/500 [==========================>...] - ETA: 11s - loss: 0.2732 - regression_loss: 0.2587 - classification_loss: 0.0145 467/500 [===========================>..] - ETA: 11s - loss: 0.2737 - regression_loss: 0.2591 - classification_loss: 0.0146 468/500 [===========================>..] - ETA: 10s - loss: 0.2737 - regression_loss: 0.2591 - classification_loss: 0.0146 469/500 [===========================>..] - ETA: 10s - loss: 0.2737 - regression_loss: 0.2592 - classification_loss: 0.0146 470/500 [===========================>..] - ETA: 10s - loss: 0.2736 - regression_loss: 0.2590 - classification_loss: 0.0146 471/500 [===========================>..] - ETA: 9s - loss: 0.2732 - regression_loss: 0.2586 - classification_loss: 0.0145  472/500 [===========================>..] - ETA: 9s - loss: 0.2732 - regression_loss: 0.2587 - classification_loss: 0.0145 473/500 [===========================>..] - ETA: 9s - loss: 0.2735 - regression_loss: 0.2590 - classification_loss: 0.0145 474/500 [===========================>..] - ETA: 8s - loss: 0.2732 - regression_loss: 0.2587 - classification_loss: 0.0145 475/500 [===========================>..] - ETA: 8s - loss: 0.2732 - regression_loss: 0.2587 - classification_loss: 0.0145 476/500 [===========================>..] - ETA: 8s - loss: 0.2729 - regression_loss: 0.2584 - classification_loss: 0.0145 477/500 [===========================>..] - ETA: 7s - loss: 0.2729 - regression_loss: 0.2584 - classification_loss: 0.0145 478/500 [===========================>..] 
- ETA: 7s - loss: 0.2725 - regression_loss: 0.2580 - classification_loss: 0.0145 479/500 [===========================>..] - ETA: 7s - loss: 0.2723 - regression_loss: 0.2578 - classification_loss: 0.0145 480/500 [===========================>..] - ETA: 6s - loss: 0.2721 - regression_loss: 0.2577 - classification_loss: 0.0145 481/500 [===========================>..] - ETA: 6s - loss: 0.2724 - regression_loss: 0.2579 - classification_loss: 0.0145 482/500 [===========================>..] - ETA: 6s - loss: 0.2726 - regression_loss: 0.2581 - classification_loss: 0.0145 483/500 [===========================>..] - ETA: 5s - loss: 0.2726 - regression_loss: 0.2581 - classification_loss: 0.0145 484/500 [============================>.] - ETA: 5s - loss: 0.2727 - regression_loss: 0.2582 - classification_loss: 0.0145 485/500 [============================>.] - ETA: 5s - loss: 0.2727 - regression_loss: 0.2582 - classification_loss: 0.0145 486/500 [============================>.] - ETA: 4s - loss: 0.2726 - regression_loss: 0.2581 - classification_loss: 0.0145 487/500 [============================>.] - ETA: 4s - loss: 0.2724 - regression_loss: 0.2579 - classification_loss: 0.0145 488/500 [============================>.] - ETA: 4s - loss: 0.2725 - regression_loss: 0.2580 - classification_loss: 0.0145 489/500 [============================>.] - ETA: 3s - loss: 0.2726 - regression_loss: 0.2581 - classification_loss: 0.0145 490/500 [============================>.] - ETA: 3s - loss: 0.2723 - regression_loss: 0.2579 - classification_loss: 0.0145 491/500 [============================>.] - ETA: 3s - loss: 0.2723 - regression_loss: 0.2579 - classification_loss: 0.0145 492/500 [============================>.] - ETA: 2s - loss: 0.2722 - regression_loss: 0.2577 - classification_loss: 0.0145 493/500 [============================>.] - ETA: 2s - loss: 0.2722 - regression_loss: 0.2577 - classification_loss: 0.0145 494/500 [============================>.] 
500/500 [==============================] - 170s 341ms/step - loss: 0.2726 - regression_loss: 0.2581 - classification_loss: 0.0146
1172 instances of class plum with average precision: 0.7492
mAP: 0.7492
Epoch 00040: saving model to ./training/snapshots/resnet101_pascal_40.h5
Epoch 41/150
- ETA: 2:21 - loss: 0.2675 - regression_loss: 0.2511 - classification_loss: 0.0164 74/500 [===>..........................] - ETA: 2:21 - loss: 0.2668 - regression_loss: 0.2503 - classification_loss: 0.0164 75/500 [===>..........................] - ETA: 2:20 - loss: 0.2674 - regression_loss: 0.2510 - classification_loss: 0.0164 76/500 [===>..........................] - ETA: 2:20 - loss: 0.2687 - regression_loss: 0.2523 - classification_loss: 0.0164 77/500 [===>..........................] - ETA: 2:20 - loss: 0.2693 - regression_loss: 0.2529 - classification_loss: 0.0164 78/500 [===>..........................] - ETA: 2:20 - loss: 0.2688 - regression_loss: 0.2525 - classification_loss: 0.0163 79/500 [===>..........................] - ETA: 2:19 - loss: 0.2664 - regression_loss: 0.2503 - classification_loss: 0.0162 80/500 [===>..........................] - ETA: 2:19 - loss: 0.2653 - regression_loss: 0.2492 - classification_loss: 0.0161 81/500 [===>..........................] - ETA: 2:19 - loss: 0.2648 - regression_loss: 0.2488 - classification_loss: 0.0160 82/500 [===>..........................] - ETA: 2:19 - loss: 0.2645 - regression_loss: 0.2484 - classification_loss: 0.0161 83/500 [===>..........................] - ETA: 2:18 - loss: 0.2638 - regression_loss: 0.2479 - classification_loss: 0.0159 84/500 [====>.........................] - ETA: 2:18 - loss: 0.2642 - regression_loss: 0.2482 - classification_loss: 0.0160 85/500 [====>.........................] - ETA: 2:18 - loss: 0.2651 - regression_loss: 0.2491 - classification_loss: 0.0160 86/500 [====>.........................] - ETA: 2:17 - loss: 0.2663 - regression_loss: 0.2503 - classification_loss: 0.0160 87/500 [====>.........................] - ETA: 2:17 - loss: 0.2659 - regression_loss: 0.2500 - classification_loss: 0.0159 88/500 [====>.........................] - ETA: 2:17 - loss: 0.2688 - regression_loss: 0.2527 - classification_loss: 0.0161 89/500 [====>.........................] 
- ETA: 2:17 - loss: 0.2676 - regression_loss: 0.2517 - classification_loss: 0.0159 90/500 [====>.........................] - ETA: 2:16 - loss: 0.2674 - regression_loss: 0.2515 - classification_loss: 0.0158 91/500 [====>.........................] - ETA: 2:16 - loss: 0.2657 - regression_loss: 0.2500 - classification_loss: 0.0157 92/500 [====>.........................] - ETA: 2:16 - loss: 0.2664 - regression_loss: 0.2507 - classification_loss: 0.0157 93/500 [====>.........................] - ETA: 2:15 - loss: 0.2673 - regression_loss: 0.2514 - classification_loss: 0.0158 94/500 [====>.........................] - ETA: 2:15 - loss: 0.2670 - regression_loss: 0.2513 - classification_loss: 0.0157 95/500 [====>.........................] - ETA: 2:15 - loss: 0.2673 - regression_loss: 0.2516 - classification_loss: 0.0158 96/500 [====>.........................] - ETA: 2:15 - loss: 0.2676 - regression_loss: 0.2519 - classification_loss: 0.0157 97/500 [====>.........................] - ETA: 2:14 - loss: 0.2688 - regression_loss: 0.2530 - classification_loss: 0.0158 98/500 [====>.........................] - ETA: 2:14 - loss: 0.2692 - regression_loss: 0.2533 - classification_loss: 0.0158 99/500 [====>.........................] - ETA: 2:14 - loss: 0.2678 - regression_loss: 0.2521 - classification_loss: 0.0157 100/500 [=====>........................] - ETA: 2:13 - loss: 0.2667 - regression_loss: 0.2511 - classification_loss: 0.0156 101/500 [=====>........................] - ETA: 2:13 - loss: 0.2653 - regression_loss: 0.2499 - classification_loss: 0.0155 102/500 [=====>........................] - ETA: 2:13 - loss: 0.2656 - regression_loss: 0.2502 - classification_loss: 0.0154 103/500 [=====>........................] - ETA: 2:12 - loss: 0.2664 - regression_loss: 0.2509 - classification_loss: 0.0155 104/500 [=====>........................] - ETA: 2:12 - loss: 0.2678 - regression_loss: 0.2523 - classification_loss: 0.0155 105/500 [=====>........................] 
- ETA: 2:12 - loss: 0.2696 - regression_loss: 0.2542 - classification_loss: 0.0154 106/500 [=====>........................] - ETA: 2:12 - loss: 0.2699 - regression_loss: 0.2545 - classification_loss: 0.0154 107/500 [=====>........................] - ETA: 2:11 - loss: 0.2698 - regression_loss: 0.2544 - classification_loss: 0.0153 108/500 [=====>........................] - ETA: 2:11 - loss: 0.2698 - regression_loss: 0.2545 - classification_loss: 0.0153 109/500 [=====>........................] - ETA: 2:11 - loss: 0.2691 - regression_loss: 0.2539 - classification_loss: 0.0153 110/500 [=====>........................] - ETA: 2:10 - loss: 0.2692 - regression_loss: 0.2539 - classification_loss: 0.0153 111/500 [=====>........................] - ETA: 2:10 - loss: 0.2698 - regression_loss: 0.2544 - classification_loss: 0.0154 112/500 [=====>........................] - ETA: 2:10 - loss: 0.2706 - regression_loss: 0.2551 - classification_loss: 0.0154 113/500 [=====>........................] - ETA: 2:09 - loss: 0.2701 - regression_loss: 0.2548 - classification_loss: 0.0154 114/500 [=====>........................] - ETA: 2:09 - loss: 0.2699 - regression_loss: 0.2545 - classification_loss: 0.0154 115/500 [=====>........................] - ETA: 2:09 - loss: 0.2702 - regression_loss: 0.2549 - classification_loss: 0.0154 116/500 [=====>........................] - ETA: 2:08 - loss: 0.2691 - regression_loss: 0.2539 - classification_loss: 0.0152 117/500 [======>.......................] - ETA: 2:08 - loss: 0.2695 - regression_loss: 0.2543 - classification_loss: 0.0152 118/500 [======>.......................] - ETA: 2:08 - loss: 0.2681 - regression_loss: 0.2531 - classification_loss: 0.0151 119/500 [======>.......................] - ETA: 2:07 - loss: 0.2668 - regression_loss: 0.2519 - classification_loss: 0.0150 120/500 [======>.......................] - ETA: 2:07 - loss: 0.2665 - regression_loss: 0.2515 - classification_loss: 0.0149 121/500 [======>.......................] 
- ETA: 2:07 - loss: 0.2662 - regression_loss: 0.2513 - classification_loss: 0.0149 122/500 [======>.......................] - ETA: 2:06 - loss: 0.2670 - regression_loss: 0.2521 - classification_loss: 0.0149 123/500 [======>.......................] - ETA: 2:06 - loss: 0.2678 - regression_loss: 0.2528 - classification_loss: 0.0150 124/500 [======>.......................] - ETA: 2:06 - loss: 0.2695 - regression_loss: 0.2540 - classification_loss: 0.0154 125/500 [======>.......................] - ETA: 2:06 - loss: 0.2690 - regression_loss: 0.2537 - classification_loss: 0.0153 126/500 [======>.......................] - ETA: 2:05 - loss: 0.2685 - regression_loss: 0.2533 - classification_loss: 0.0153 127/500 [======>.......................] - ETA: 2:05 - loss: 0.2690 - regression_loss: 0.2538 - classification_loss: 0.0152 128/500 [======>.......................] - ETA: 2:05 - loss: 0.2684 - regression_loss: 0.2532 - classification_loss: 0.0152 129/500 [======>.......................] - ETA: 2:04 - loss: 0.2686 - regression_loss: 0.2534 - classification_loss: 0.0151 130/500 [======>.......................] - ETA: 2:04 - loss: 0.2694 - regression_loss: 0.2542 - classification_loss: 0.0152 131/500 [======>.......................] - ETA: 2:04 - loss: 0.2696 - regression_loss: 0.2543 - classification_loss: 0.0152 132/500 [======>.......................] - ETA: 2:03 - loss: 0.2699 - regression_loss: 0.2545 - classification_loss: 0.0153 133/500 [======>.......................] - ETA: 2:03 - loss: 0.2712 - regression_loss: 0.2559 - classification_loss: 0.0154 134/500 [=======>......................] - ETA: 2:03 - loss: 0.2727 - regression_loss: 0.2573 - classification_loss: 0.0154 135/500 [=======>......................] - ETA: 2:02 - loss: 0.2727 - regression_loss: 0.2573 - classification_loss: 0.0154 136/500 [=======>......................] - ETA: 2:02 - loss: 0.2718 - regression_loss: 0.2565 - classification_loss: 0.0153 137/500 [=======>......................] 
- ETA: 2:02 - loss: 0.2718 - regression_loss: 0.2565 - classification_loss: 0.0153 138/500 [=======>......................] - ETA: 2:01 - loss: 0.2703 - regression_loss: 0.2552 - classification_loss: 0.0152 139/500 [=======>......................] - ETA: 2:01 - loss: 0.2726 - regression_loss: 0.2573 - classification_loss: 0.0152 140/500 [=======>......................] - ETA: 2:01 - loss: 0.2730 - regression_loss: 0.2578 - classification_loss: 0.0151 141/500 [=======>......................] - ETA: 2:00 - loss: 0.2738 - regression_loss: 0.2586 - classification_loss: 0.0152 142/500 [=======>......................] - ETA: 2:00 - loss: 0.2732 - regression_loss: 0.2581 - classification_loss: 0.0151 143/500 [=======>......................] - ETA: 2:00 - loss: 0.2735 - regression_loss: 0.2584 - classification_loss: 0.0151 144/500 [=======>......................] - ETA: 1:59 - loss: 0.2736 - regression_loss: 0.2585 - classification_loss: 0.0151 145/500 [=======>......................] - ETA: 1:59 - loss: 0.2734 - regression_loss: 0.2582 - classification_loss: 0.0152 146/500 [=======>......................] - ETA: 1:59 - loss: 0.2745 - regression_loss: 0.2591 - classification_loss: 0.0154 147/500 [=======>......................] - ETA: 1:58 - loss: 0.2741 - regression_loss: 0.2588 - classification_loss: 0.0153 148/500 [=======>......................] - ETA: 1:58 - loss: 0.2738 - regression_loss: 0.2585 - classification_loss: 0.0153 149/500 [=======>......................] - ETA: 1:58 - loss: 0.2736 - regression_loss: 0.2584 - classification_loss: 0.0152 150/500 [========>.....................] - ETA: 1:57 - loss: 0.2751 - regression_loss: 0.2598 - classification_loss: 0.0153 151/500 [========>.....................] - ETA: 1:57 - loss: 0.2752 - regression_loss: 0.2599 - classification_loss: 0.0153 152/500 [========>.....................] - ETA: 1:57 - loss: 0.2741 - regression_loss: 0.2589 - classification_loss: 0.0152 153/500 [========>.....................] 
- ETA: 1:56 - loss: 0.2752 - regression_loss: 0.2599 - classification_loss: 0.0153 154/500 [========>.....................] - ETA: 1:56 - loss: 0.2748 - regression_loss: 0.2595 - classification_loss: 0.0153 155/500 [========>.....................] - ETA: 1:56 - loss: 0.2757 - regression_loss: 0.2603 - classification_loss: 0.0153 156/500 [========>.....................] - ETA: 1:55 - loss: 0.2747 - regression_loss: 0.2594 - classification_loss: 0.0153 157/500 [========>.....................] - ETA: 1:55 - loss: 0.2738 - regression_loss: 0.2586 - classification_loss: 0.0153 158/500 [========>.....................] - ETA: 1:55 - loss: 0.2730 - regression_loss: 0.2578 - classification_loss: 0.0152 159/500 [========>.....................] - ETA: 1:54 - loss: 0.2724 - regression_loss: 0.2572 - classification_loss: 0.0151 160/500 [========>.....................] - ETA: 1:54 - loss: 0.2713 - regression_loss: 0.2562 - classification_loss: 0.0151 161/500 [========>.....................] - ETA: 1:54 - loss: 0.2723 - regression_loss: 0.2572 - classification_loss: 0.0151 162/500 [========>.....................] - ETA: 1:54 - loss: 0.2722 - regression_loss: 0.2572 - classification_loss: 0.0150 163/500 [========>.....................] - ETA: 1:53 - loss: 0.2726 - regression_loss: 0.2576 - classification_loss: 0.0150 164/500 [========>.....................] - ETA: 1:53 - loss: 0.2714 - regression_loss: 0.2564 - classification_loss: 0.0149 165/500 [========>.....................] - ETA: 1:53 - loss: 0.2725 - regression_loss: 0.2574 - classification_loss: 0.0150 166/500 [========>.....................] - ETA: 1:52 - loss: 0.2729 - regression_loss: 0.2577 - classification_loss: 0.0151 167/500 [=========>....................] - ETA: 1:52 - loss: 0.2727 - regression_loss: 0.2576 - classification_loss: 0.0151 168/500 [=========>....................] - ETA: 1:52 - loss: 0.2731 - regression_loss: 0.2580 - classification_loss: 0.0151 169/500 [=========>....................] 
- ETA: 1:51 - loss: 0.2738 - regression_loss: 0.2587 - classification_loss: 0.0151 170/500 [=========>....................] - ETA: 1:51 - loss: 0.2741 - regression_loss: 0.2590 - classification_loss: 0.0151 171/500 [=========>....................] - ETA: 1:51 - loss: 0.2735 - regression_loss: 0.2585 - classification_loss: 0.0150 172/500 [=========>....................] - ETA: 1:50 - loss: 0.2730 - regression_loss: 0.2580 - classification_loss: 0.0150 173/500 [=========>....................] - ETA: 1:50 - loss: 0.2732 - regression_loss: 0.2582 - classification_loss: 0.0150 174/500 [=========>....................] - ETA: 1:50 - loss: 0.2728 - regression_loss: 0.2578 - classification_loss: 0.0149 175/500 [=========>....................] - ETA: 1:49 - loss: 0.2719 - regression_loss: 0.2570 - classification_loss: 0.0149 176/500 [=========>....................] - ETA: 1:49 - loss: 0.2722 - regression_loss: 0.2573 - classification_loss: 0.0148 177/500 [=========>....................] - ETA: 1:49 - loss: 0.2724 - regression_loss: 0.2575 - classification_loss: 0.0148 178/500 [=========>....................] - ETA: 1:48 - loss: 0.2723 - regression_loss: 0.2574 - classification_loss: 0.0149 179/500 [=========>....................] - ETA: 1:48 - loss: 0.2727 - regression_loss: 0.2578 - classification_loss: 0.0149 180/500 [=========>....................] - ETA: 1:48 - loss: 0.2723 - regression_loss: 0.2574 - classification_loss: 0.0148 181/500 [=========>....................] - ETA: 1:47 - loss: 0.2720 - regression_loss: 0.2572 - classification_loss: 0.0148 182/500 [=========>....................] - ETA: 1:47 - loss: 0.2723 - regression_loss: 0.2575 - classification_loss: 0.0148 183/500 [=========>....................] - ETA: 1:47 - loss: 0.2716 - regression_loss: 0.2569 - classification_loss: 0.0147 184/500 [==========>...................] - ETA: 1:46 - loss: 0.2714 - regression_loss: 0.2568 - classification_loss: 0.0147 185/500 [==========>...................] 
- ETA: 1:46 - loss: 0.2710 - regression_loss: 0.2564 - classification_loss: 0.0146 186/500 [==========>...................] - ETA: 1:46 - loss: 0.2718 - regression_loss: 0.2571 - classification_loss: 0.0147 187/500 [==========>...................] - ETA: 1:45 - loss: 0.2726 - regression_loss: 0.2578 - classification_loss: 0.0148 188/500 [==========>...................] - ETA: 1:45 - loss: 0.2729 - regression_loss: 0.2580 - classification_loss: 0.0149 189/500 [==========>...................] - ETA: 1:45 - loss: 0.2730 - regression_loss: 0.2582 - classification_loss: 0.0149 190/500 [==========>...................] - ETA: 1:44 - loss: 0.2741 - regression_loss: 0.2591 - classification_loss: 0.0151 191/500 [==========>...................] - ETA: 1:44 - loss: 0.2737 - regression_loss: 0.2587 - classification_loss: 0.0150 192/500 [==========>...................] - ETA: 1:44 - loss: 0.2738 - regression_loss: 0.2587 - classification_loss: 0.0151 193/500 [==========>...................] - ETA: 1:43 - loss: 0.2734 - regression_loss: 0.2584 - classification_loss: 0.0150 194/500 [==========>...................] - ETA: 1:43 - loss: 0.2736 - regression_loss: 0.2587 - classification_loss: 0.0150 195/500 [==========>...................] - ETA: 1:43 - loss: 0.2733 - regression_loss: 0.2584 - classification_loss: 0.0149 196/500 [==========>...................] - ETA: 1:42 - loss: 0.2726 - regression_loss: 0.2578 - classification_loss: 0.0149 197/500 [==========>...................] - ETA: 1:42 - loss: 0.2717 - regression_loss: 0.2569 - classification_loss: 0.0148 198/500 [==========>...................] - ETA: 1:42 - loss: 0.2716 - regression_loss: 0.2568 - classification_loss: 0.0148 199/500 [==========>...................] - ETA: 1:41 - loss: 0.2725 - regression_loss: 0.2577 - classification_loss: 0.0148 200/500 [===========>..................] - ETA: 1:41 - loss: 0.2715 - regression_loss: 0.2567 - classification_loss: 0.0148 201/500 [===========>..................] 
- ETA: 1:41 - loss: 0.2713 - regression_loss: 0.2566 - classification_loss: 0.0147 202/500 [===========>..................] - ETA: 1:40 - loss: 0.2720 - regression_loss: 0.2572 - classification_loss: 0.0147 203/500 [===========>..................] - ETA: 1:40 - loss: 0.2721 - regression_loss: 0.2575 - classification_loss: 0.0147 204/500 [===========>..................] - ETA: 1:40 - loss: 0.2721 - regression_loss: 0.2575 - classification_loss: 0.0146 205/500 [===========>..................] - ETA: 1:39 - loss: 0.2729 - regression_loss: 0.2583 - classification_loss: 0.0146 206/500 [===========>..................] - ETA: 1:39 - loss: 0.2722 - regression_loss: 0.2577 - classification_loss: 0.0145 207/500 [===========>..................] - ETA: 1:39 - loss: 0.2720 - regression_loss: 0.2576 - classification_loss: 0.0145 208/500 [===========>..................] - ETA: 1:38 - loss: 0.2726 - regression_loss: 0.2580 - classification_loss: 0.0146 209/500 [===========>..................] - ETA: 1:38 - loss: 0.2722 - regression_loss: 0.2577 - classification_loss: 0.0146 210/500 [===========>..................] - ETA: 1:38 - loss: 0.2716 - regression_loss: 0.2570 - classification_loss: 0.0145 211/500 [===========>..................] - ETA: 1:37 - loss: 0.2721 - regression_loss: 0.2576 - classification_loss: 0.0146 212/500 [===========>..................] - ETA: 1:37 - loss: 0.2729 - regression_loss: 0.2583 - classification_loss: 0.0146 213/500 [===========>..................] - ETA: 1:37 - loss: 0.2725 - regression_loss: 0.2580 - classification_loss: 0.0145 214/500 [===========>..................] - ETA: 1:36 - loss: 0.2723 - regression_loss: 0.2577 - classification_loss: 0.0145 215/500 [===========>..................] - ETA: 1:36 - loss: 0.2719 - regression_loss: 0.2574 - classification_loss: 0.0145 216/500 [===========>..................] - ETA: 1:36 - loss: 0.2723 - regression_loss: 0.2577 - classification_loss: 0.0146 217/500 [============>.................] 
- ETA: 1:35 - loss: 0.2722 - regression_loss: 0.2576 - classification_loss: 0.0146 218/500 [============>.................] - ETA: 1:35 - loss: 0.2725 - regression_loss: 0.2579 - classification_loss: 0.0146 219/500 [============>.................] - ETA: 1:35 - loss: 0.2723 - regression_loss: 0.2577 - classification_loss: 0.0146 220/500 [============>.................] - ETA: 1:34 - loss: 0.2727 - regression_loss: 0.2580 - classification_loss: 0.0147 221/500 [============>.................] - ETA: 1:34 - loss: 0.2720 - regression_loss: 0.2573 - classification_loss: 0.0147 222/500 [============>.................] - ETA: 1:34 - loss: 0.2718 - regression_loss: 0.2571 - classification_loss: 0.0147 223/500 [============>.................] - ETA: 1:33 - loss: 0.2728 - regression_loss: 0.2580 - classification_loss: 0.0147 224/500 [============>.................] - ETA: 1:33 - loss: 0.2724 - regression_loss: 0.2576 - classification_loss: 0.0147 225/500 [============>.................] - ETA: 1:33 - loss: 0.2734 - regression_loss: 0.2586 - classification_loss: 0.0148 226/500 [============>.................] - ETA: 1:32 - loss: 0.2734 - regression_loss: 0.2587 - classification_loss: 0.0148 227/500 [============>.................] - ETA: 1:32 - loss: 0.2735 - regression_loss: 0.2588 - classification_loss: 0.0147 228/500 [============>.................] - ETA: 1:32 - loss: 0.2734 - regression_loss: 0.2586 - classification_loss: 0.0147 229/500 [============>.................] - ETA: 1:31 - loss: 0.2730 - regression_loss: 0.2583 - classification_loss: 0.0147 230/500 [============>.................] - ETA: 1:31 - loss: 0.2736 - regression_loss: 0.2589 - classification_loss: 0.0147 231/500 [============>.................] - ETA: 1:31 - loss: 0.2741 - regression_loss: 0.2594 - classification_loss: 0.0147 232/500 [============>.................] - ETA: 1:30 - loss: 0.2742 - regression_loss: 0.2595 - classification_loss: 0.0147 233/500 [============>.................] 
- ETA: 1:30 - loss: 0.2745 - regression_loss: 0.2598 - classification_loss: 0.0146 234/500 [=============>................] - ETA: 1:30 - loss: 0.2742 - regression_loss: 0.2596 - classification_loss: 0.0146 235/500 [=============>................] - ETA: 1:29 - loss: 0.2739 - regression_loss: 0.2593 - classification_loss: 0.0146 236/500 [=============>................] - ETA: 1:29 - loss: 0.2738 - regression_loss: 0.2592 - classification_loss: 0.0146 237/500 [=============>................] - ETA: 1:29 - loss: 0.2733 - regression_loss: 0.2588 - classification_loss: 0.0145 238/500 [=============>................] - ETA: 1:28 - loss: 0.2738 - regression_loss: 0.2593 - classification_loss: 0.0145 239/500 [=============>................] - ETA: 1:28 - loss: 0.2747 - regression_loss: 0.2602 - classification_loss: 0.0145 240/500 [=============>................] - ETA: 1:28 - loss: 0.2750 - regression_loss: 0.2604 - classification_loss: 0.0145 241/500 [=============>................] - ETA: 1:27 - loss: 0.2769 - regression_loss: 0.2622 - classification_loss: 0.0148 242/500 [=============>................] - ETA: 1:27 - loss: 0.2768 - regression_loss: 0.2621 - classification_loss: 0.0147 243/500 [=============>................] - ETA: 1:26 - loss: 0.2777 - regression_loss: 0.2629 - classification_loss: 0.0147 244/500 [=============>................] - ETA: 1:26 - loss: 0.2775 - regression_loss: 0.2627 - classification_loss: 0.0147 245/500 [=============>................] - ETA: 1:26 - loss: 0.2771 - regression_loss: 0.2624 - classification_loss: 0.0147 246/500 [=============>................] - ETA: 1:25 - loss: 0.2771 - regression_loss: 0.2624 - classification_loss: 0.0147 247/500 [=============>................] - ETA: 1:25 - loss: 0.2779 - regression_loss: 0.2633 - classification_loss: 0.0147 248/500 [=============>................] - ETA: 1:25 - loss: 0.2781 - regression_loss: 0.2635 - classification_loss: 0.0146 249/500 [=============>................] 
- ETA: 1:24 - loss: 0.2777 - regression_loss: 0.2631 - classification_loss: 0.0146 250/500 [==============>...............] - ETA: 1:24 - loss: 0.2787 - regression_loss: 0.2641 - classification_loss: 0.0146 251/500 [==============>...............] - ETA: 1:24 - loss: 0.2797 - regression_loss: 0.2651 - classification_loss: 0.0147 252/500 [==============>...............] - ETA: 1:23 - loss: 0.2795 - regression_loss: 0.2649 - classification_loss: 0.0146 253/500 [==============>...............] - ETA: 1:23 - loss: 0.2798 - regression_loss: 0.2652 - classification_loss: 0.0146 254/500 [==============>...............] - ETA: 1:23 - loss: 0.2796 - regression_loss: 0.2649 - classification_loss: 0.0146 255/500 [==============>...............] - ETA: 1:22 - loss: 0.2796 - regression_loss: 0.2649 - classification_loss: 0.0146 256/500 [==============>...............] - ETA: 1:22 - loss: 0.2796 - regression_loss: 0.2649 - classification_loss: 0.0146 257/500 [==============>...............] - ETA: 1:22 - loss: 0.2791 - regression_loss: 0.2646 - classification_loss: 0.0146 258/500 [==============>...............] - ETA: 1:21 - loss: 0.2792 - regression_loss: 0.2646 - classification_loss: 0.0145 259/500 [==============>...............] - ETA: 1:21 - loss: 0.2790 - regression_loss: 0.2645 - classification_loss: 0.0145 260/500 [==============>...............] - ETA: 1:21 - loss: 0.2796 - regression_loss: 0.2651 - classification_loss: 0.0146 261/500 [==============>...............] - ETA: 1:20 - loss: 0.2802 - regression_loss: 0.2656 - classification_loss: 0.0146 262/500 [==============>...............] - ETA: 1:20 - loss: 0.2800 - regression_loss: 0.2655 - classification_loss: 0.0146 263/500 [==============>...............] - ETA: 1:20 - loss: 0.2798 - regression_loss: 0.2652 - classification_loss: 0.0145 264/500 [==============>...............] - ETA: 1:19 - loss: 0.2797 - regression_loss: 0.2652 - classification_loss: 0.0145 265/500 [==============>...............] 
- ETA: 1:19 - loss: 0.2796 - regression_loss: 0.2651 - classification_loss: 0.0145 266/500 [==============>...............] - ETA: 1:19 - loss: 0.2795 - regression_loss: 0.2650 - classification_loss: 0.0145 267/500 [===============>..............] - ETA: 1:19 - loss: 0.2791 - regression_loss: 0.2646 - classification_loss: 0.0145 268/500 [===============>..............] - ETA: 1:18 - loss: 0.2795 - regression_loss: 0.2650 - classification_loss: 0.0145 269/500 [===============>..............] - ETA: 1:18 - loss: 0.2795 - regression_loss: 0.2650 - classification_loss: 0.0145 270/500 [===============>..............] - ETA: 1:18 - loss: 0.2797 - regression_loss: 0.2652 - classification_loss: 0.0145 271/500 [===============>..............] - ETA: 1:17 - loss: 0.2790 - regression_loss: 0.2645 - classification_loss: 0.0145 272/500 [===============>..............] - ETA: 1:17 - loss: 0.2788 - regression_loss: 0.2644 - classification_loss: 0.0145 273/500 [===============>..............] - ETA: 1:17 - loss: 0.2784 - regression_loss: 0.2640 - classification_loss: 0.0144 274/500 [===============>..............] - ETA: 1:16 - loss: 0.2787 - regression_loss: 0.2642 - classification_loss: 0.0145 275/500 [===============>..............] - ETA: 1:16 - loss: 0.2789 - regression_loss: 0.2644 - classification_loss: 0.0145 276/500 [===============>..............] - ETA: 1:15 - loss: 0.2784 - regression_loss: 0.2639 - classification_loss: 0.0145 277/500 [===============>..............] - ETA: 1:15 - loss: 0.2782 - regression_loss: 0.2637 - classification_loss: 0.0145 278/500 [===============>..............] - ETA: 1:15 - loss: 0.2783 - regression_loss: 0.2638 - classification_loss: 0.0145 279/500 [===============>..............] - ETA: 1:14 - loss: 0.2783 - regression_loss: 0.2637 - classification_loss: 0.0145 280/500 [===============>..............] - ETA: 1:14 - loss: 0.2783 - regression_loss: 0.2638 - classification_loss: 0.0145 281/500 [===============>..............] 
500/500 [==============================] - 170s 340ms/step - loss: 0.2729 - regression_loss: 0.2588 - classification_loss: 0.0141
1172 instances of class plum with average precision: 0.7427
mAP: 0.7427
Epoch 00041: saving model to ./training/snapshots/resnet101_pascal_41.h5
Epoch 42/150
- ETA: 2:11 - loss: 0.2763 - regression_loss: 0.2624 - classification_loss: 0.0140 117/500 [======>.......................] - ETA: 2:11 - loss: 0.2803 - regression_loss: 0.2659 - classification_loss: 0.0144 118/500 [======>.......................] - ETA: 2:11 - loss: 0.2804 - regression_loss: 0.2660 - classification_loss: 0.0144 119/500 [======>.......................] - ETA: 2:10 - loss: 0.2792 - regression_loss: 0.2649 - classification_loss: 0.0143 120/500 [======>.......................] - ETA: 2:10 - loss: 0.2787 - regression_loss: 0.2645 - classification_loss: 0.0142 121/500 [======>.......................] - ETA: 2:09 - loss: 0.2779 - regression_loss: 0.2637 - classification_loss: 0.0141 122/500 [======>.......................] - ETA: 2:09 - loss: 0.2775 - regression_loss: 0.2634 - classification_loss: 0.0141 123/500 [======>.......................] - ETA: 2:09 - loss: 0.2782 - regression_loss: 0.2641 - classification_loss: 0.0141 124/500 [======>.......................] - ETA: 2:08 - loss: 0.2778 - regression_loss: 0.2637 - classification_loss: 0.0141 125/500 [======>.......................] - ETA: 2:08 - loss: 0.2767 - regression_loss: 0.2627 - classification_loss: 0.0140 126/500 [======>.......................] - ETA: 2:08 - loss: 0.2767 - regression_loss: 0.2626 - classification_loss: 0.0140 127/500 [======>.......................] - ETA: 2:07 - loss: 0.2769 - regression_loss: 0.2629 - classification_loss: 0.0140 128/500 [======>.......................] - ETA: 2:07 - loss: 0.2756 - regression_loss: 0.2617 - classification_loss: 0.0139 129/500 [======>.......................] - ETA: 2:07 - loss: 0.2746 - regression_loss: 0.2608 - classification_loss: 0.0139 130/500 [======>.......................] - ETA: 2:06 - loss: 0.2746 - regression_loss: 0.2607 - classification_loss: 0.0139 131/500 [======>.......................] - ETA: 2:06 - loss: 0.2742 - regression_loss: 0.2603 - classification_loss: 0.0138 132/500 [======>.......................] 
- ETA: 2:06 - loss: 0.2742 - regression_loss: 0.2602 - classification_loss: 0.0139 133/500 [======>.......................] - ETA: 2:05 - loss: 0.2741 - regression_loss: 0.2602 - classification_loss: 0.0139 134/500 [=======>......................] - ETA: 2:05 - loss: 0.2735 - regression_loss: 0.2596 - classification_loss: 0.0139 135/500 [=======>......................] - ETA: 2:05 - loss: 0.2740 - regression_loss: 0.2601 - classification_loss: 0.0139 136/500 [=======>......................] - ETA: 2:04 - loss: 0.2746 - regression_loss: 0.2605 - classification_loss: 0.0141 137/500 [=======>......................] - ETA: 2:04 - loss: 0.2751 - regression_loss: 0.2609 - classification_loss: 0.0142 138/500 [=======>......................] - ETA: 2:04 - loss: 0.2741 - regression_loss: 0.2599 - classification_loss: 0.0141 139/500 [=======>......................] - ETA: 2:03 - loss: 0.2732 - regression_loss: 0.2591 - classification_loss: 0.0141 140/500 [=======>......................] - ETA: 2:03 - loss: 0.2722 - regression_loss: 0.2582 - classification_loss: 0.0140 141/500 [=======>......................] - ETA: 2:03 - loss: 0.2712 - regression_loss: 0.2573 - classification_loss: 0.0139 142/500 [=======>......................] - ETA: 2:02 - loss: 0.2722 - regression_loss: 0.2581 - classification_loss: 0.0141 143/500 [=======>......................] - ETA: 2:02 - loss: 0.2718 - regression_loss: 0.2577 - classification_loss: 0.0141 144/500 [=======>......................] - ETA: 2:02 - loss: 0.2715 - regression_loss: 0.2574 - classification_loss: 0.0141 145/500 [=======>......................] - ETA: 2:01 - loss: 0.2729 - regression_loss: 0.2586 - classification_loss: 0.0143 146/500 [=======>......................] - ETA: 2:01 - loss: 0.2742 - regression_loss: 0.2597 - classification_loss: 0.0145 147/500 [=======>......................] - ETA: 2:01 - loss: 0.2746 - regression_loss: 0.2601 - classification_loss: 0.0145 148/500 [=======>......................] 
- ETA: 2:00 - loss: 0.2740 - regression_loss: 0.2596 - classification_loss: 0.0144 149/500 [=======>......................] - ETA: 2:00 - loss: 0.2744 - regression_loss: 0.2600 - classification_loss: 0.0145 150/500 [========>.....................] - ETA: 2:00 - loss: 0.2746 - regression_loss: 0.2600 - classification_loss: 0.0146 151/500 [========>.....................] - ETA: 1:59 - loss: 0.2748 - regression_loss: 0.2603 - classification_loss: 0.0145 152/500 [========>.....................] - ETA: 1:59 - loss: 0.2746 - regression_loss: 0.2601 - classification_loss: 0.0145 153/500 [========>.....................] - ETA: 1:59 - loss: 0.2739 - regression_loss: 0.2594 - classification_loss: 0.0145 154/500 [========>.....................] - ETA: 1:58 - loss: 0.2726 - regression_loss: 0.2582 - classification_loss: 0.0144 155/500 [========>.....................] - ETA: 1:58 - loss: 0.2733 - regression_loss: 0.2588 - classification_loss: 0.0144 156/500 [========>.....................] - ETA: 1:57 - loss: 0.2736 - regression_loss: 0.2592 - classification_loss: 0.0145 157/500 [========>.....................] - ETA: 1:57 - loss: 0.2744 - regression_loss: 0.2599 - classification_loss: 0.0144 158/500 [========>.....................] - ETA: 1:57 - loss: 0.2744 - regression_loss: 0.2599 - classification_loss: 0.0145 159/500 [========>.....................] - ETA: 1:56 - loss: 0.2737 - regression_loss: 0.2593 - classification_loss: 0.0144 160/500 [========>.....................] - ETA: 1:56 - loss: 0.2728 - regression_loss: 0.2584 - classification_loss: 0.0144 161/500 [========>.....................] - ETA: 1:56 - loss: 0.2729 - regression_loss: 0.2585 - classification_loss: 0.0144 162/500 [========>.....................] - ETA: 1:55 - loss: 0.2737 - regression_loss: 0.2593 - classification_loss: 0.0144 163/500 [========>.....................] - ETA: 1:55 - loss: 0.2759 - regression_loss: 0.2614 - classification_loss: 0.0145 164/500 [========>.....................] 
- ETA: 1:55 - loss: 0.2763 - regression_loss: 0.2616 - classification_loss: 0.0147 165/500 [========>.....................] - ETA: 1:54 - loss: 0.2766 - regression_loss: 0.2619 - classification_loss: 0.0147 166/500 [========>.....................] - ETA: 1:54 - loss: 0.2762 - regression_loss: 0.2616 - classification_loss: 0.0146 167/500 [=========>....................] - ETA: 1:54 - loss: 0.2788 - regression_loss: 0.2639 - classification_loss: 0.0149 168/500 [=========>....................] - ETA: 1:53 - loss: 0.2784 - regression_loss: 0.2636 - classification_loss: 0.0149 169/500 [=========>....................] - ETA: 1:53 - loss: 0.2787 - regression_loss: 0.2639 - classification_loss: 0.0148 170/500 [=========>....................] - ETA: 1:53 - loss: 0.2783 - regression_loss: 0.2635 - classification_loss: 0.0148 171/500 [=========>....................] - ETA: 1:52 - loss: 0.2783 - regression_loss: 0.2636 - classification_loss: 0.0147 172/500 [=========>....................] - ETA: 1:52 - loss: 0.2795 - regression_loss: 0.2647 - classification_loss: 0.0148 173/500 [=========>....................] - ETA: 1:52 - loss: 0.2799 - regression_loss: 0.2651 - classification_loss: 0.0148 174/500 [=========>....................] - ETA: 1:51 - loss: 0.2799 - regression_loss: 0.2650 - classification_loss: 0.0148 175/500 [=========>....................] - ETA: 1:51 - loss: 0.2801 - regression_loss: 0.2653 - classification_loss: 0.0148 176/500 [=========>....................] - ETA: 1:51 - loss: 0.2798 - regression_loss: 0.2650 - classification_loss: 0.0148 177/500 [=========>....................] - ETA: 1:50 - loss: 0.2803 - regression_loss: 0.2655 - classification_loss: 0.0148 178/500 [=========>....................] - ETA: 1:50 - loss: 0.2814 - regression_loss: 0.2666 - classification_loss: 0.0148 179/500 [=========>....................] - ETA: 1:50 - loss: 0.2816 - regression_loss: 0.2668 - classification_loss: 0.0148 180/500 [=========>....................] 
- ETA: 1:49 - loss: 0.2821 - regression_loss: 0.2672 - classification_loss: 0.0148 181/500 [=========>....................] - ETA: 1:49 - loss: 0.2821 - regression_loss: 0.2673 - classification_loss: 0.0148 182/500 [=========>....................] - ETA: 1:49 - loss: 0.2837 - regression_loss: 0.2689 - classification_loss: 0.0148 183/500 [=========>....................] - ETA: 1:48 - loss: 0.2835 - regression_loss: 0.2687 - classification_loss: 0.0148 184/500 [==========>...................] - ETA: 1:48 - loss: 0.2828 - regression_loss: 0.2680 - classification_loss: 0.0148 185/500 [==========>...................] - ETA: 1:48 - loss: 0.2842 - regression_loss: 0.2693 - classification_loss: 0.0149 186/500 [==========>...................] - ETA: 1:47 - loss: 0.2848 - regression_loss: 0.2699 - classification_loss: 0.0149 187/500 [==========>...................] - ETA: 1:47 - loss: 0.2847 - regression_loss: 0.2699 - classification_loss: 0.0149 188/500 [==========>...................] - ETA: 1:47 - loss: 0.2849 - regression_loss: 0.2701 - classification_loss: 0.0149 189/500 [==========>...................] - ETA: 1:46 - loss: 0.2851 - regression_loss: 0.2703 - classification_loss: 0.0149 190/500 [==========>...................] - ETA: 1:46 - loss: 0.2848 - regression_loss: 0.2699 - classification_loss: 0.0148 191/500 [==========>...................] - ETA: 1:46 - loss: 0.2842 - regression_loss: 0.2694 - classification_loss: 0.0148 192/500 [==========>...................] - ETA: 1:45 - loss: 0.2848 - regression_loss: 0.2699 - classification_loss: 0.0148 193/500 [==========>...................] - ETA: 1:45 - loss: 0.2845 - regression_loss: 0.2696 - classification_loss: 0.0149 194/500 [==========>...................] - ETA: 1:45 - loss: 0.2842 - regression_loss: 0.2694 - classification_loss: 0.0148 195/500 [==========>...................] - ETA: 1:44 - loss: 0.2835 - regression_loss: 0.2687 - classification_loss: 0.0148 196/500 [==========>...................] 
- ETA: 1:44 - loss: 0.2827 - regression_loss: 0.2680 - classification_loss: 0.0147 197/500 [==========>...................] - ETA: 1:44 - loss: 0.2824 - regression_loss: 0.2677 - classification_loss: 0.0147 198/500 [==========>...................] - ETA: 1:43 - loss: 0.2826 - regression_loss: 0.2679 - classification_loss: 0.0147 199/500 [==========>...................] - ETA: 1:43 - loss: 0.2821 - regression_loss: 0.2674 - classification_loss: 0.0147 200/500 [===========>..................] - ETA: 1:42 - loss: 0.2817 - regression_loss: 0.2670 - classification_loss: 0.0147 201/500 [===========>..................] - ETA: 1:42 - loss: 0.2822 - regression_loss: 0.2675 - classification_loss: 0.0147 202/500 [===========>..................] - ETA: 1:42 - loss: 0.2819 - regression_loss: 0.2672 - classification_loss: 0.0147 203/500 [===========>..................] - ETA: 1:41 - loss: 0.2830 - regression_loss: 0.2682 - classification_loss: 0.0148 204/500 [===========>..................] - ETA: 1:41 - loss: 0.2824 - regression_loss: 0.2676 - classification_loss: 0.0148 205/500 [===========>..................] - ETA: 1:41 - loss: 0.2832 - regression_loss: 0.2684 - classification_loss: 0.0148 206/500 [===========>..................] - ETA: 1:40 - loss: 0.2831 - regression_loss: 0.2683 - classification_loss: 0.0148 207/500 [===========>..................] - ETA: 1:40 - loss: 0.2831 - regression_loss: 0.2683 - classification_loss: 0.0148 208/500 [===========>..................] - ETA: 1:40 - loss: 0.2828 - regression_loss: 0.2681 - classification_loss: 0.0147 209/500 [===========>..................] - ETA: 1:39 - loss: 0.2831 - regression_loss: 0.2684 - classification_loss: 0.0147 210/500 [===========>..................] - ETA: 1:39 - loss: 0.2829 - regression_loss: 0.2683 - classification_loss: 0.0147 211/500 [===========>..................] - ETA: 1:39 - loss: 0.2829 - regression_loss: 0.2682 - classification_loss: 0.0146 212/500 [===========>..................] 
- ETA: 1:38 - loss: 0.2826 - regression_loss: 0.2680 - classification_loss: 0.0146 213/500 [===========>..................] - ETA: 1:38 - loss: 0.2827 - regression_loss: 0.2680 - classification_loss: 0.0146 214/500 [===========>..................] - ETA: 1:38 - loss: 0.2828 - regression_loss: 0.2682 - classification_loss: 0.0146 215/500 [===========>..................] - ETA: 1:37 - loss: 0.2827 - regression_loss: 0.2681 - classification_loss: 0.0146 216/500 [===========>..................] - ETA: 1:37 - loss: 0.2833 - regression_loss: 0.2688 - classification_loss: 0.0146 217/500 [============>.................] - ETA: 1:37 - loss: 0.2834 - regression_loss: 0.2688 - classification_loss: 0.0145 218/500 [============>.................] - ETA: 1:36 - loss: 0.2842 - regression_loss: 0.2696 - classification_loss: 0.0146 219/500 [============>.................] - ETA: 1:36 - loss: 0.2838 - regression_loss: 0.2692 - classification_loss: 0.0146 220/500 [============>.................] - ETA: 1:36 - loss: 0.2844 - regression_loss: 0.2698 - classification_loss: 0.0146 221/500 [============>.................] - ETA: 1:35 - loss: 0.2843 - regression_loss: 0.2697 - classification_loss: 0.0146 222/500 [============>.................] - ETA: 1:35 - loss: 0.2849 - regression_loss: 0.2702 - classification_loss: 0.0146 223/500 [============>.................] - ETA: 1:35 - loss: 0.2848 - regression_loss: 0.2702 - classification_loss: 0.0146 224/500 [============>.................] - ETA: 1:34 - loss: 0.2847 - regression_loss: 0.2701 - classification_loss: 0.0146 225/500 [============>.................] - ETA: 1:34 - loss: 0.2844 - regression_loss: 0.2698 - classification_loss: 0.0145 226/500 [============>.................] - ETA: 1:34 - loss: 0.2843 - regression_loss: 0.2697 - classification_loss: 0.0145 227/500 [============>.................] - ETA: 1:33 - loss: 0.2836 - regression_loss: 0.2692 - classification_loss: 0.0145 228/500 [============>.................] 
- ETA: 1:33 - loss: 0.2836 - regression_loss: 0.2691 - classification_loss: 0.0145 229/500 [============>.................] - ETA: 1:33 - loss: 0.2833 - regression_loss: 0.2689 - classification_loss: 0.0145 230/500 [============>.................] - ETA: 1:32 - loss: 0.2830 - regression_loss: 0.2686 - classification_loss: 0.0144 231/500 [============>.................] - ETA: 1:32 - loss: 0.2824 - regression_loss: 0.2680 - classification_loss: 0.0144 232/500 [============>.................] - ETA: 1:32 - loss: 0.2832 - regression_loss: 0.2687 - classification_loss: 0.0144 233/500 [============>.................] - ETA: 1:31 - loss: 0.2831 - regression_loss: 0.2687 - classification_loss: 0.0144 234/500 [=============>................] - ETA: 1:31 - loss: 0.2836 - regression_loss: 0.2692 - classification_loss: 0.0145 235/500 [=============>................] - ETA: 1:31 - loss: 0.2852 - regression_loss: 0.2708 - classification_loss: 0.0145 236/500 [=============>................] - ETA: 1:30 - loss: 0.2851 - regression_loss: 0.2707 - classification_loss: 0.0145 237/500 [=============>................] - ETA: 1:30 - loss: 0.2843 - regression_loss: 0.2699 - classification_loss: 0.0144 238/500 [=============>................] - ETA: 1:30 - loss: 0.2842 - regression_loss: 0.2698 - classification_loss: 0.0144 239/500 [=============>................] - ETA: 1:29 - loss: 0.2845 - regression_loss: 0.2701 - classification_loss: 0.0144 240/500 [=============>................] - ETA: 1:29 - loss: 0.2840 - regression_loss: 0.2696 - classification_loss: 0.0144 241/500 [=============>................] - ETA: 1:29 - loss: 0.2836 - regression_loss: 0.2692 - classification_loss: 0.0144 242/500 [=============>................] - ETA: 1:28 - loss: 0.2844 - regression_loss: 0.2700 - classification_loss: 0.0144 243/500 [=============>................] - ETA: 1:28 - loss: 0.2855 - regression_loss: 0.2711 - classification_loss: 0.0144 244/500 [=============>................] 
- ETA: 1:28 - loss: 0.2868 - regression_loss: 0.2724 - classification_loss: 0.0145 245/500 [=============>................] - ETA: 1:27 - loss: 0.2864 - regression_loss: 0.2720 - classification_loss: 0.0144 246/500 [=============>................] - ETA: 1:27 - loss: 0.2866 - regression_loss: 0.2721 - classification_loss: 0.0145 247/500 [=============>................] - ETA: 1:26 - loss: 0.2863 - regression_loss: 0.2718 - classification_loss: 0.0145 248/500 [=============>................] - ETA: 1:26 - loss: 0.2858 - regression_loss: 0.2714 - classification_loss: 0.0144 249/500 [=============>................] - ETA: 1:26 - loss: 0.2857 - regression_loss: 0.2712 - classification_loss: 0.0145 250/500 [==============>...............] - ETA: 1:25 - loss: 0.2854 - regression_loss: 0.2709 - classification_loss: 0.0145 251/500 [==============>...............] - ETA: 1:25 - loss: 0.2856 - regression_loss: 0.2711 - classification_loss: 0.0145 252/500 [==============>...............] - ETA: 1:25 - loss: 0.2854 - regression_loss: 0.2709 - classification_loss: 0.0145 253/500 [==============>...............] - ETA: 1:24 - loss: 0.2863 - regression_loss: 0.2718 - classification_loss: 0.0146 254/500 [==============>...............] - ETA: 1:24 - loss: 0.2861 - regression_loss: 0.2715 - classification_loss: 0.0146 255/500 [==============>...............] - ETA: 1:24 - loss: 0.2856 - regression_loss: 0.2710 - classification_loss: 0.0145 256/500 [==============>...............] - ETA: 1:23 - loss: 0.2861 - regression_loss: 0.2715 - classification_loss: 0.0146 257/500 [==============>...............] - ETA: 1:23 - loss: 0.2860 - regression_loss: 0.2714 - classification_loss: 0.0146 258/500 [==============>...............] - ETA: 1:23 - loss: 0.2855 - regression_loss: 0.2709 - classification_loss: 0.0146 259/500 [==============>...............] - ETA: 1:22 - loss: 0.2854 - regression_loss: 0.2708 - classification_loss: 0.0146 260/500 [==============>...............] 
- ETA: 1:22 - loss: 0.2847 - regression_loss: 0.2702 - classification_loss: 0.0146 261/500 [==============>...............] - ETA: 1:22 - loss: 0.2843 - regression_loss: 0.2698 - classification_loss: 0.0145 262/500 [==============>...............] - ETA: 1:21 - loss: 0.2841 - regression_loss: 0.2696 - classification_loss: 0.0145 263/500 [==============>...............] - ETA: 1:21 - loss: 0.2839 - regression_loss: 0.2694 - classification_loss: 0.0145 264/500 [==============>...............] - ETA: 1:21 - loss: 0.2834 - regression_loss: 0.2690 - classification_loss: 0.0145 265/500 [==============>...............] - ETA: 1:20 - loss: 0.2836 - regression_loss: 0.2691 - classification_loss: 0.0145 266/500 [==============>...............] - ETA: 1:20 - loss: 0.2834 - regression_loss: 0.2689 - classification_loss: 0.0145 267/500 [===============>..............] - ETA: 1:20 - loss: 0.2835 - regression_loss: 0.2690 - classification_loss: 0.0146 268/500 [===============>..............] - ETA: 1:19 - loss: 0.2831 - regression_loss: 0.2685 - classification_loss: 0.0145 269/500 [===============>..............] - ETA: 1:19 - loss: 0.2834 - regression_loss: 0.2688 - classification_loss: 0.0146 270/500 [===============>..............] - ETA: 1:19 - loss: 0.2831 - regression_loss: 0.2685 - classification_loss: 0.0146 271/500 [===============>..............] - ETA: 1:18 - loss: 0.2836 - regression_loss: 0.2689 - classification_loss: 0.0147 272/500 [===============>..............] - ETA: 1:18 - loss: 0.2835 - regression_loss: 0.2689 - classification_loss: 0.0146 273/500 [===============>..............] - ETA: 1:17 - loss: 0.2836 - regression_loss: 0.2690 - classification_loss: 0.0146 274/500 [===============>..............] - ETA: 1:17 - loss: 0.2835 - regression_loss: 0.2689 - classification_loss: 0.0146 275/500 [===============>..............] - ETA: 1:17 - loss: 0.2834 - regression_loss: 0.2688 - classification_loss: 0.0146 276/500 [===============>..............] 
- ETA: 1:16 - loss: 0.2833 - regression_loss: 0.2687 - classification_loss: 0.0146 277/500 [===============>..............] - ETA: 1:16 - loss: 0.2833 - regression_loss: 0.2687 - classification_loss: 0.0146 278/500 [===============>..............] - ETA: 1:16 - loss: 0.2832 - regression_loss: 0.2686 - classification_loss: 0.0146 279/500 [===============>..............] - ETA: 1:15 - loss: 0.2830 - regression_loss: 0.2685 - classification_loss: 0.0146 280/500 [===============>..............] - ETA: 1:15 - loss: 0.2836 - regression_loss: 0.2690 - classification_loss: 0.0146 281/500 [===============>..............] - ETA: 1:15 - loss: 0.2834 - regression_loss: 0.2689 - classification_loss: 0.0146 282/500 [===============>..............] - ETA: 1:14 - loss: 0.2835 - regression_loss: 0.2689 - classification_loss: 0.0146 283/500 [===============>..............] - ETA: 1:14 - loss: 0.2834 - regression_loss: 0.2689 - classification_loss: 0.0145 284/500 [================>.............] - ETA: 1:14 - loss: 0.2831 - regression_loss: 0.2686 - classification_loss: 0.0145 285/500 [================>.............] - ETA: 1:13 - loss: 0.2827 - regression_loss: 0.2682 - classification_loss: 0.0145 286/500 [================>.............] - ETA: 1:13 - loss: 0.2830 - regression_loss: 0.2685 - classification_loss: 0.0145 287/500 [================>.............] - ETA: 1:13 - loss: 0.2826 - regression_loss: 0.2681 - classification_loss: 0.0145 288/500 [================>.............] - ETA: 1:12 - loss: 0.2824 - regression_loss: 0.2679 - classification_loss: 0.0145 289/500 [================>.............] - ETA: 1:12 - loss: 0.2822 - regression_loss: 0.2677 - classification_loss: 0.0145 290/500 [================>.............] - ETA: 1:12 - loss: 0.2821 - regression_loss: 0.2676 - classification_loss: 0.0145 291/500 [================>.............] - ETA: 1:11 - loss: 0.2816 - regression_loss: 0.2672 - classification_loss: 0.0145 292/500 [================>.............] 
- ETA: 1:11 - loss: 0.2810 - regression_loss: 0.2665 - classification_loss: 0.0144 293/500 [================>.............] - ETA: 1:11 - loss: 0.2808 - regression_loss: 0.2663 - classification_loss: 0.0145 294/500 [================>.............] - ETA: 1:10 - loss: 0.2804 - regression_loss: 0.2659 - classification_loss: 0.0144 295/500 [================>.............] - ETA: 1:10 - loss: 0.2799 - regression_loss: 0.2655 - classification_loss: 0.0144 296/500 [================>.............] - ETA: 1:10 - loss: 0.2798 - regression_loss: 0.2654 - classification_loss: 0.0144 297/500 [================>.............] - ETA: 1:09 - loss: 0.2793 - regression_loss: 0.2649 - classification_loss: 0.0144 298/500 [================>.............] - ETA: 1:09 - loss: 0.2789 - regression_loss: 0.2645 - classification_loss: 0.0144 299/500 [================>.............] - ETA: 1:09 - loss: 0.2789 - regression_loss: 0.2645 - classification_loss: 0.0144 300/500 [=================>............] - ETA: 1:08 - loss: 0.2792 - regression_loss: 0.2648 - classification_loss: 0.0144 301/500 [=================>............] - ETA: 1:08 - loss: 0.2791 - regression_loss: 0.2648 - classification_loss: 0.0144 302/500 [=================>............] - ETA: 1:08 - loss: 0.2792 - regression_loss: 0.2648 - classification_loss: 0.0144 303/500 [=================>............] - ETA: 1:07 - loss: 0.2791 - regression_loss: 0.2647 - classification_loss: 0.0144 304/500 [=================>............] - ETA: 1:07 - loss: 0.2788 - regression_loss: 0.2645 - classification_loss: 0.0143 305/500 [=================>............] - ETA: 1:06 - loss: 0.2794 - regression_loss: 0.2650 - classification_loss: 0.0144 306/500 [=================>............] - ETA: 1:06 - loss: 0.2790 - regression_loss: 0.2646 - classification_loss: 0.0144 307/500 [=================>............] - ETA: 1:06 - loss: 0.2788 - regression_loss: 0.2644 - classification_loss: 0.0143 308/500 [=================>............] 
- ETA: 1:05 - loss: 0.2785 - regression_loss: 0.2642 - classification_loss: 0.0143 309/500 [=================>............] - ETA: 1:05 - loss: 0.2785 - regression_loss: 0.2642 - classification_loss: 0.0143 310/500 [=================>............] - ETA: 1:05 - loss: 0.2785 - regression_loss: 0.2642 - classification_loss: 0.0143 311/500 [=================>............] - ETA: 1:04 - loss: 0.2787 - regression_loss: 0.2645 - classification_loss: 0.0143 312/500 [=================>............] - ETA: 1:04 - loss: 0.2790 - regression_loss: 0.2647 - classification_loss: 0.0143 313/500 [=================>............] - ETA: 1:04 - loss: 0.2794 - regression_loss: 0.2651 - classification_loss: 0.0143 314/500 [=================>............] - ETA: 1:03 - loss: 0.2798 - regression_loss: 0.2654 - classification_loss: 0.0144 315/500 [=================>............] - ETA: 1:03 - loss: 0.2793 - regression_loss: 0.2650 - classification_loss: 0.0143 316/500 [=================>............] - ETA: 1:03 - loss: 0.2790 - regression_loss: 0.2647 - classification_loss: 0.0143 317/500 [==================>...........] - ETA: 1:02 - loss: 0.2792 - regression_loss: 0.2649 - classification_loss: 0.0143 318/500 [==================>...........] - ETA: 1:02 - loss: 0.2788 - regression_loss: 0.2645 - classification_loss: 0.0143 319/500 [==================>...........] - ETA: 1:02 - loss: 0.2786 - regression_loss: 0.2643 - classification_loss: 0.0143 320/500 [==================>...........] - ETA: 1:01 - loss: 0.2788 - regression_loss: 0.2645 - classification_loss: 0.0143 321/500 [==================>...........] - ETA: 1:01 - loss: 0.2790 - regression_loss: 0.2646 - classification_loss: 0.0143 322/500 [==================>...........] - ETA: 1:01 - loss: 0.2806 - regression_loss: 0.2662 - classification_loss: 0.0144 323/500 [==================>...........] - ETA: 1:00 - loss: 0.2813 - regression_loss: 0.2669 - classification_loss: 0.0144 324/500 [==================>...........] 
- ETA: 1:00 - loss: 0.2811 - regression_loss: 0.2667 - classification_loss: 0.0144
[per-step progress updates for steps 325-499 omitted; loss fluctuated between roughly 0.274 and 0.284]
500/500 [==============================] - 171s 343ms/step - loss: 0.2739 - regression_loss: 0.2600 - classification_loss: 0.0139
1172 instances of class plum with average precision: 0.7655
mAP: 0.7655
Epoch 00042: saving model to ./training/snapshots/resnet101_pascal_42.h5
Epoch 43/150
[per-step progress updates for steps 1-157 omitted; loss settled around 0.26 with classification_loss near 0.013]
158/500 [========>.....................]
- ETA: 1:57 - loss: 0.2608 - regression_loss: 0.2476 - classification_loss: 0.0132 159/500 [========>.....................] - ETA: 1:57 - loss: 0.2605 - regression_loss: 0.2472 - classification_loss: 0.0133 160/500 [========>.....................] - ETA: 1:56 - loss: 0.2601 - regression_loss: 0.2468 - classification_loss: 0.0133 161/500 [========>.....................] - ETA: 1:56 - loss: 0.2601 - regression_loss: 0.2468 - classification_loss: 0.0133 162/500 [========>.....................] - ETA: 1:56 - loss: 0.2602 - regression_loss: 0.2468 - classification_loss: 0.0134 163/500 [========>.....................] - ETA: 1:55 - loss: 0.2602 - regression_loss: 0.2468 - classification_loss: 0.0135 164/500 [========>.....................] - ETA: 1:55 - loss: 0.2597 - regression_loss: 0.2463 - classification_loss: 0.0134 165/500 [========>.....................] - ETA: 1:55 - loss: 0.2604 - regression_loss: 0.2469 - classification_loss: 0.0135 166/500 [========>.....................] - ETA: 1:54 - loss: 0.2605 - regression_loss: 0.2471 - classification_loss: 0.0134 167/500 [=========>....................] - ETA: 1:54 - loss: 0.2610 - regression_loss: 0.2473 - classification_loss: 0.0137 168/500 [=========>....................] - ETA: 1:54 - loss: 0.2601 - regression_loss: 0.2465 - classification_loss: 0.0136 169/500 [=========>....................] - ETA: 1:53 - loss: 0.2602 - regression_loss: 0.2467 - classification_loss: 0.0136 170/500 [=========>....................] - ETA: 1:53 - loss: 0.2595 - regression_loss: 0.2459 - classification_loss: 0.0135 171/500 [=========>....................] - ETA: 1:52 - loss: 0.2603 - regression_loss: 0.2468 - classification_loss: 0.0135 172/500 [=========>....................] - ETA: 1:52 - loss: 0.2598 - regression_loss: 0.2463 - classification_loss: 0.0135 173/500 [=========>....................] - ETA: 1:52 - loss: 0.2605 - regression_loss: 0.2469 - classification_loss: 0.0136 174/500 [=========>....................] 
- ETA: 1:51 - loss: 0.2602 - regression_loss: 0.2467 - classification_loss: 0.0135 175/500 [=========>....................] - ETA: 1:51 - loss: 0.2603 - regression_loss: 0.2468 - classification_loss: 0.0135 176/500 [=========>....................] - ETA: 1:51 - loss: 0.2609 - regression_loss: 0.2473 - classification_loss: 0.0136 177/500 [=========>....................] - ETA: 1:50 - loss: 0.2612 - regression_loss: 0.2476 - classification_loss: 0.0136 178/500 [=========>....................] - ETA: 1:50 - loss: 0.2608 - regression_loss: 0.2473 - classification_loss: 0.0136 179/500 [=========>....................] - ETA: 1:50 - loss: 0.2608 - regression_loss: 0.2472 - classification_loss: 0.0135 180/500 [=========>....................] - ETA: 1:49 - loss: 0.2606 - regression_loss: 0.2471 - classification_loss: 0.0135 181/500 [=========>....................] - ETA: 1:49 - loss: 0.2598 - regression_loss: 0.2464 - classification_loss: 0.0134 182/500 [=========>....................] - ETA: 1:49 - loss: 0.2601 - regression_loss: 0.2466 - classification_loss: 0.0135 183/500 [=========>....................] - ETA: 1:48 - loss: 0.2595 - regression_loss: 0.2461 - classification_loss: 0.0134 184/500 [==========>...................] - ETA: 1:48 - loss: 0.2588 - regression_loss: 0.2455 - classification_loss: 0.0134 185/500 [==========>...................] - ETA: 1:48 - loss: 0.2584 - regression_loss: 0.2450 - classification_loss: 0.0134 186/500 [==========>...................] - ETA: 1:47 - loss: 0.2576 - regression_loss: 0.2443 - classification_loss: 0.0133 187/500 [==========>...................] - ETA: 1:47 - loss: 0.2567 - regression_loss: 0.2435 - classification_loss: 0.0132 188/500 [==========>...................] - ETA: 1:47 - loss: 0.2565 - regression_loss: 0.2434 - classification_loss: 0.0132 189/500 [==========>...................] - ETA: 1:46 - loss: 0.2563 - regression_loss: 0.2431 - classification_loss: 0.0132 190/500 [==========>...................] 
- ETA: 1:46 - loss: 0.2560 - regression_loss: 0.2428 - classification_loss: 0.0131 191/500 [==========>...................] - ETA: 1:46 - loss: 0.2557 - regression_loss: 0.2426 - classification_loss: 0.0131 192/500 [==========>...................] - ETA: 1:45 - loss: 0.2550 - regression_loss: 0.2420 - classification_loss: 0.0130 193/500 [==========>...................] - ETA: 1:45 - loss: 0.2552 - regression_loss: 0.2421 - classification_loss: 0.0130 194/500 [==========>...................] - ETA: 1:44 - loss: 0.2555 - regression_loss: 0.2424 - classification_loss: 0.0130 195/500 [==========>...................] - ETA: 1:44 - loss: 0.2553 - regression_loss: 0.2423 - classification_loss: 0.0131 196/500 [==========>...................] - ETA: 1:44 - loss: 0.2546 - regression_loss: 0.2416 - classification_loss: 0.0130 197/500 [==========>...................] - ETA: 1:43 - loss: 0.2544 - regression_loss: 0.2414 - classification_loss: 0.0130 198/500 [==========>...................] - ETA: 1:43 - loss: 0.2540 - regression_loss: 0.2410 - classification_loss: 0.0129 199/500 [==========>...................] - ETA: 1:43 - loss: 0.2540 - regression_loss: 0.2410 - classification_loss: 0.0129 200/500 [===========>..................] - ETA: 1:42 - loss: 0.2536 - regression_loss: 0.2407 - classification_loss: 0.0129 201/500 [===========>..................] - ETA: 1:42 - loss: 0.2536 - regression_loss: 0.2406 - classification_loss: 0.0129 202/500 [===========>..................] - ETA: 1:42 - loss: 0.2540 - regression_loss: 0.2410 - classification_loss: 0.0130 203/500 [===========>..................] - ETA: 1:41 - loss: 0.2532 - regression_loss: 0.2402 - classification_loss: 0.0129 204/500 [===========>..................] - ETA: 1:41 - loss: 0.2531 - regression_loss: 0.2402 - classification_loss: 0.0129 205/500 [===========>..................] - ETA: 1:41 - loss: 0.2527 - regression_loss: 0.2399 - classification_loss: 0.0129 206/500 [===========>..................] 
- ETA: 1:40 - loss: 0.2530 - regression_loss: 0.2401 - classification_loss: 0.0129 207/500 [===========>..................] - ETA: 1:40 - loss: 0.2524 - regression_loss: 0.2396 - classification_loss: 0.0128 208/500 [===========>..................] - ETA: 1:40 - loss: 0.2518 - regression_loss: 0.2390 - classification_loss: 0.0128 209/500 [===========>..................] - ETA: 1:39 - loss: 0.2512 - regression_loss: 0.2385 - classification_loss: 0.0128 210/500 [===========>..................] - ETA: 1:39 - loss: 0.2514 - regression_loss: 0.2385 - classification_loss: 0.0129 211/500 [===========>..................] - ETA: 1:39 - loss: 0.2521 - regression_loss: 0.2391 - classification_loss: 0.0130 212/500 [===========>..................] - ETA: 1:38 - loss: 0.2522 - regression_loss: 0.2391 - classification_loss: 0.0130 213/500 [===========>..................] - ETA: 1:38 - loss: 0.2526 - regression_loss: 0.2396 - classification_loss: 0.0130 214/500 [===========>..................] - ETA: 1:38 - loss: 0.2532 - regression_loss: 0.2402 - classification_loss: 0.0130 215/500 [===========>..................] - ETA: 1:37 - loss: 0.2534 - regression_loss: 0.2404 - classification_loss: 0.0130 216/500 [===========>..................] - ETA: 1:37 - loss: 0.2532 - regression_loss: 0.2402 - classification_loss: 0.0130 217/500 [============>.................] - ETA: 1:37 - loss: 0.2528 - regression_loss: 0.2398 - classification_loss: 0.0130 218/500 [============>.................] - ETA: 1:36 - loss: 0.2526 - regression_loss: 0.2397 - classification_loss: 0.0130 219/500 [============>.................] - ETA: 1:36 - loss: 0.2523 - regression_loss: 0.2394 - classification_loss: 0.0129 220/500 [============>.................] - ETA: 1:36 - loss: 0.2524 - regression_loss: 0.2395 - classification_loss: 0.0129 221/500 [============>.................] - ETA: 1:35 - loss: 0.2525 - regression_loss: 0.2396 - classification_loss: 0.0129 222/500 [============>.................] 
- ETA: 1:35 - loss: 0.2526 - regression_loss: 0.2397 - classification_loss: 0.0129 223/500 [============>.................] - ETA: 1:35 - loss: 0.2530 - regression_loss: 0.2401 - classification_loss: 0.0129 224/500 [============>.................] - ETA: 1:34 - loss: 0.2528 - regression_loss: 0.2399 - classification_loss: 0.0129 225/500 [============>.................] - ETA: 1:34 - loss: 0.2541 - regression_loss: 0.2413 - classification_loss: 0.0129 226/500 [============>.................] - ETA: 1:34 - loss: 0.2539 - regression_loss: 0.2410 - classification_loss: 0.0128 227/500 [============>.................] - ETA: 1:33 - loss: 0.2537 - regression_loss: 0.2409 - classification_loss: 0.0128 228/500 [============>.................] - ETA: 1:33 - loss: 0.2533 - regression_loss: 0.2405 - classification_loss: 0.0128 229/500 [============>.................] - ETA: 1:33 - loss: 0.2534 - regression_loss: 0.2406 - classification_loss: 0.0128 230/500 [============>.................] - ETA: 1:32 - loss: 0.2533 - regression_loss: 0.2405 - classification_loss: 0.0128 231/500 [============>.................] - ETA: 1:32 - loss: 0.2532 - regression_loss: 0.2405 - classification_loss: 0.0127 232/500 [============>.................] - ETA: 1:32 - loss: 0.2530 - regression_loss: 0.2403 - classification_loss: 0.0127 233/500 [============>.................] - ETA: 1:31 - loss: 0.2537 - regression_loss: 0.2410 - classification_loss: 0.0128 234/500 [=============>................] - ETA: 1:31 - loss: 0.2533 - regression_loss: 0.2406 - classification_loss: 0.0127 235/500 [=============>................] - ETA: 1:31 - loss: 0.2535 - regression_loss: 0.2407 - classification_loss: 0.0127 236/500 [=============>................] - ETA: 1:30 - loss: 0.2541 - regression_loss: 0.2413 - classification_loss: 0.0127 237/500 [=============>................] - ETA: 1:30 - loss: 0.2539 - regression_loss: 0.2412 - classification_loss: 0.0127 238/500 [=============>................] 
- ETA: 1:30 - loss: 0.2536 - regression_loss: 0.2409 - classification_loss: 0.0127 239/500 [=============>................] - ETA: 1:29 - loss: 0.2537 - regression_loss: 0.2410 - classification_loss: 0.0127 240/500 [=============>................] - ETA: 1:29 - loss: 0.2536 - regression_loss: 0.2408 - classification_loss: 0.0127 241/500 [=============>................] - ETA: 1:28 - loss: 0.2533 - regression_loss: 0.2406 - classification_loss: 0.0127 242/500 [=============>................] - ETA: 1:28 - loss: 0.2533 - regression_loss: 0.2406 - classification_loss: 0.0127 243/500 [=============>................] - ETA: 1:28 - loss: 0.2532 - regression_loss: 0.2405 - classification_loss: 0.0127 244/500 [=============>................] - ETA: 1:27 - loss: 0.2530 - regression_loss: 0.2403 - classification_loss: 0.0127 245/500 [=============>................] - ETA: 1:27 - loss: 0.2524 - regression_loss: 0.2397 - classification_loss: 0.0127 246/500 [=============>................] - ETA: 1:27 - loss: 0.2530 - regression_loss: 0.2403 - classification_loss: 0.0127 247/500 [=============>................] - ETA: 1:26 - loss: 0.2529 - regression_loss: 0.2403 - classification_loss: 0.0127 248/500 [=============>................] - ETA: 1:26 - loss: 0.2530 - regression_loss: 0.2403 - classification_loss: 0.0127 249/500 [=============>................] - ETA: 1:26 - loss: 0.2531 - regression_loss: 0.2404 - classification_loss: 0.0127 250/500 [==============>...............] - ETA: 1:25 - loss: 0.2530 - regression_loss: 0.2403 - classification_loss: 0.0127 251/500 [==============>...............] - ETA: 1:25 - loss: 0.2529 - regression_loss: 0.2402 - classification_loss: 0.0127 252/500 [==============>...............] - ETA: 1:25 - loss: 0.2523 - regression_loss: 0.2397 - classification_loss: 0.0127 253/500 [==============>...............] - ETA: 1:24 - loss: 0.2524 - regression_loss: 0.2398 - classification_loss: 0.0127 254/500 [==============>...............] 
- ETA: 1:24 - loss: 0.2523 - regression_loss: 0.2396 - classification_loss: 0.0127 255/500 [==============>...............] - ETA: 1:24 - loss: 0.2526 - regression_loss: 0.2399 - classification_loss: 0.0127 256/500 [==============>...............] - ETA: 1:23 - loss: 0.2521 - regression_loss: 0.2395 - classification_loss: 0.0126 257/500 [==============>...............] - ETA: 1:23 - loss: 0.2523 - regression_loss: 0.2397 - classification_loss: 0.0127 258/500 [==============>...............] - ETA: 1:23 - loss: 0.2525 - regression_loss: 0.2398 - classification_loss: 0.0127 259/500 [==============>...............] - ETA: 1:22 - loss: 0.2525 - regression_loss: 0.2398 - classification_loss: 0.0127 260/500 [==============>...............] - ETA: 1:22 - loss: 0.2530 - regression_loss: 0.2402 - classification_loss: 0.0129 261/500 [==============>...............] - ETA: 1:22 - loss: 0.2528 - regression_loss: 0.2399 - classification_loss: 0.0129 262/500 [==============>...............] - ETA: 1:21 - loss: 0.2539 - regression_loss: 0.2410 - classification_loss: 0.0129 263/500 [==============>...............] - ETA: 1:21 - loss: 0.2536 - regression_loss: 0.2408 - classification_loss: 0.0129 264/500 [==============>...............] - ETA: 1:20 - loss: 0.2531 - regression_loss: 0.2403 - classification_loss: 0.0128 265/500 [==============>...............] - ETA: 1:20 - loss: 0.2535 - regression_loss: 0.2405 - classification_loss: 0.0129 266/500 [==============>...............] - ETA: 1:20 - loss: 0.2535 - regression_loss: 0.2406 - classification_loss: 0.0129 267/500 [===============>..............] - ETA: 1:19 - loss: 0.2534 - regression_loss: 0.2404 - classification_loss: 0.0129 268/500 [===============>..............] - ETA: 1:19 - loss: 0.2528 - regression_loss: 0.2399 - classification_loss: 0.0129 269/500 [===============>..............] - ETA: 1:19 - loss: 0.2532 - regression_loss: 0.2402 - classification_loss: 0.0130 270/500 [===============>..............] 
- ETA: 1:18 - loss: 0.2534 - regression_loss: 0.2404 - classification_loss: 0.0130 271/500 [===============>..............] - ETA: 1:18 - loss: 0.2533 - regression_loss: 0.2404 - classification_loss: 0.0130 272/500 [===============>..............] - ETA: 1:18 - loss: 0.2537 - regression_loss: 0.2407 - classification_loss: 0.0130 273/500 [===============>..............] - ETA: 1:17 - loss: 0.2542 - regression_loss: 0.2411 - classification_loss: 0.0131 274/500 [===============>..............] - ETA: 1:17 - loss: 0.2541 - regression_loss: 0.2409 - classification_loss: 0.0131 275/500 [===============>..............] - ETA: 1:17 - loss: 0.2544 - regression_loss: 0.2413 - classification_loss: 0.0131 276/500 [===============>..............] - ETA: 1:16 - loss: 0.2543 - regression_loss: 0.2412 - classification_loss: 0.0131 277/500 [===============>..............] - ETA: 1:16 - loss: 0.2544 - regression_loss: 0.2413 - classification_loss: 0.0131 278/500 [===============>..............] - ETA: 1:16 - loss: 0.2554 - regression_loss: 0.2423 - classification_loss: 0.0132 279/500 [===============>..............] - ETA: 1:15 - loss: 0.2556 - regression_loss: 0.2424 - classification_loss: 0.0132 280/500 [===============>..............] - ETA: 1:15 - loss: 0.2561 - regression_loss: 0.2429 - classification_loss: 0.0132 281/500 [===============>..............] - ETA: 1:15 - loss: 0.2564 - regression_loss: 0.2433 - classification_loss: 0.0132 282/500 [===============>..............] - ETA: 1:14 - loss: 0.2567 - regression_loss: 0.2435 - classification_loss: 0.0132 283/500 [===============>..............] - ETA: 1:14 - loss: 0.2566 - regression_loss: 0.2434 - classification_loss: 0.0132 284/500 [================>.............] - ETA: 1:14 - loss: 0.2564 - regression_loss: 0.2432 - classification_loss: 0.0132 285/500 [================>.............] - ETA: 1:13 - loss: 0.2559 - regression_loss: 0.2427 - classification_loss: 0.0132 286/500 [================>.............] 
- ETA: 1:13 - loss: 0.2565 - regression_loss: 0.2433 - classification_loss: 0.0132 287/500 [================>.............] - ETA: 1:13 - loss: 0.2565 - regression_loss: 0.2433 - classification_loss: 0.0132 288/500 [================>.............] - ETA: 1:12 - loss: 0.2578 - regression_loss: 0.2446 - classification_loss: 0.0132 289/500 [================>.............] - ETA: 1:12 - loss: 0.2581 - regression_loss: 0.2449 - classification_loss: 0.0132 290/500 [================>.............] - ETA: 1:12 - loss: 0.2599 - regression_loss: 0.2466 - classification_loss: 0.0133 291/500 [================>.............] - ETA: 1:11 - loss: 0.2609 - regression_loss: 0.2476 - classification_loss: 0.0133 292/500 [================>.............] - ETA: 1:11 - loss: 0.2613 - regression_loss: 0.2480 - classification_loss: 0.0133 293/500 [================>.............] - ETA: 1:11 - loss: 0.2611 - regression_loss: 0.2478 - classification_loss: 0.0133 294/500 [================>.............] - ETA: 1:10 - loss: 0.2611 - regression_loss: 0.2479 - classification_loss: 0.0133 295/500 [================>.............] - ETA: 1:10 - loss: 0.2609 - regression_loss: 0.2476 - classification_loss: 0.0132 296/500 [================>.............] - ETA: 1:10 - loss: 0.2608 - regression_loss: 0.2476 - classification_loss: 0.0132 297/500 [================>.............] - ETA: 1:09 - loss: 0.2611 - regression_loss: 0.2479 - classification_loss: 0.0132 298/500 [================>.............] - ETA: 1:09 - loss: 0.2612 - regression_loss: 0.2481 - classification_loss: 0.0132 299/500 [================>.............] - ETA: 1:08 - loss: 0.2612 - regression_loss: 0.2480 - classification_loss: 0.0131 300/500 [=================>............] - ETA: 1:08 - loss: 0.2612 - regression_loss: 0.2481 - classification_loss: 0.0131 301/500 [=================>............] - ETA: 1:08 - loss: 0.2612 - regression_loss: 0.2480 - classification_loss: 0.0132 302/500 [=================>............] 
- ETA: 1:07 - loss: 0.2623 - regression_loss: 0.2491 - classification_loss: 0.0132 303/500 [=================>............] - ETA: 1:07 - loss: 0.2631 - regression_loss: 0.2498 - classification_loss: 0.0133 304/500 [=================>............] - ETA: 1:07 - loss: 0.2628 - regression_loss: 0.2496 - classification_loss: 0.0132 305/500 [=================>............] - ETA: 1:06 - loss: 0.2632 - regression_loss: 0.2499 - classification_loss: 0.0133 306/500 [=================>............] - ETA: 1:06 - loss: 0.2637 - regression_loss: 0.2505 - classification_loss: 0.0133 307/500 [=================>............] - ETA: 1:06 - loss: 0.2644 - regression_loss: 0.2512 - classification_loss: 0.0133 308/500 [=================>............] - ETA: 1:05 - loss: 0.2649 - regression_loss: 0.2516 - classification_loss: 0.0133 309/500 [=================>............] - ETA: 1:05 - loss: 0.2648 - regression_loss: 0.2516 - classification_loss: 0.0133 310/500 [=================>............] - ETA: 1:05 - loss: 0.2648 - regression_loss: 0.2516 - classification_loss: 0.0133 311/500 [=================>............] - ETA: 1:04 - loss: 0.2650 - regression_loss: 0.2517 - classification_loss: 0.0133 312/500 [=================>............] - ETA: 1:04 - loss: 0.2654 - regression_loss: 0.2522 - classification_loss: 0.0133 313/500 [=================>............] - ETA: 1:04 - loss: 0.2654 - regression_loss: 0.2522 - classification_loss: 0.0133 314/500 [=================>............] - ETA: 1:03 - loss: 0.2656 - regression_loss: 0.2523 - classification_loss: 0.0133 315/500 [=================>............] - ETA: 1:03 - loss: 0.2656 - regression_loss: 0.2524 - classification_loss: 0.0133 316/500 [=================>............] - ETA: 1:03 - loss: 0.2657 - regression_loss: 0.2524 - classification_loss: 0.0133 317/500 [==================>...........] - ETA: 1:02 - loss: 0.2661 - regression_loss: 0.2528 - classification_loss: 0.0133 318/500 [==================>...........] 
- ETA: 1:02 - loss: 0.2661 - regression_loss: 0.2528 - classification_loss: 0.0133 319/500 [==================>...........] - ETA: 1:02 - loss: 0.2663 - regression_loss: 0.2530 - classification_loss: 0.0133 320/500 [==================>...........] - ETA: 1:01 - loss: 0.2664 - regression_loss: 0.2531 - classification_loss: 0.0133 321/500 [==================>...........] - ETA: 1:01 - loss: 0.2669 - regression_loss: 0.2536 - classification_loss: 0.0133 322/500 [==================>...........] - ETA: 1:01 - loss: 0.2666 - regression_loss: 0.2533 - classification_loss: 0.0133 323/500 [==================>...........] - ETA: 1:00 - loss: 0.2667 - regression_loss: 0.2534 - classification_loss: 0.0134 324/500 [==================>...........] - ETA: 1:00 - loss: 0.2668 - regression_loss: 0.2534 - classification_loss: 0.0134 325/500 [==================>...........] - ETA: 1:00 - loss: 0.2666 - regression_loss: 0.2533 - classification_loss: 0.0133 326/500 [==================>...........] - ETA: 59s - loss: 0.2662 - regression_loss: 0.2529 - classification_loss: 0.0133  327/500 [==================>...........] - ETA: 59s - loss: 0.2659 - regression_loss: 0.2527 - classification_loss: 0.0133 328/500 [==================>...........] - ETA: 58s - loss: 0.2653 - regression_loss: 0.2520 - classification_loss: 0.0133 329/500 [==================>...........] - ETA: 58s - loss: 0.2651 - regression_loss: 0.2518 - classification_loss: 0.0132 330/500 [==================>...........] - ETA: 58s - loss: 0.2649 - regression_loss: 0.2517 - classification_loss: 0.0132 331/500 [==================>...........] - ETA: 57s - loss: 0.2644 - regression_loss: 0.2512 - classification_loss: 0.0132 332/500 [==================>...........] - ETA: 57s - loss: 0.2649 - regression_loss: 0.2517 - classification_loss: 0.0132 333/500 [==================>...........] - ETA: 57s - loss: 0.2647 - regression_loss: 0.2515 - classification_loss: 0.0132 334/500 [===================>..........] 
- ETA: 56s - loss: 0.2646 - regression_loss: 0.2515 - classification_loss: 0.0131 335/500 [===================>..........] - ETA: 56s - loss: 0.2645 - regression_loss: 0.2514 - classification_loss: 0.0131 336/500 [===================>..........] - ETA: 56s - loss: 0.2642 - regression_loss: 0.2511 - classification_loss: 0.0131 337/500 [===================>..........] - ETA: 55s - loss: 0.2639 - regression_loss: 0.2508 - classification_loss: 0.0131 338/500 [===================>..........] - ETA: 55s - loss: 0.2636 - regression_loss: 0.2505 - classification_loss: 0.0131 339/500 [===================>..........] - ETA: 55s - loss: 0.2633 - regression_loss: 0.2502 - classification_loss: 0.0130 340/500 [===================>..........] - ETA: 54s - loss: 0.2630 - regression_loss: 0.2500 - classification_loss: 0.0130 341/500 [===================>..........] - ETA: 54s - loss: 0.2634 - regression_loss: 0.2504 - classification_loss: 0.0130 342/500 [===================>..........] - ETA: 54s - loss: 0.2628 - regression_loss: 0.2499 - classification_loss: 0.0130 343/500 [===================>..........] - ETA: 53s - loss: 0.2627 - regression_loss: 0.2497 - classification_loss: 0.0130 344/500 [===================>..........] - ETA: 53s - loss: 0.2629 - regression_loss: 0.2499 - classification_loss: 0.0130 345/500 [===================>..........] - ETA: 53s - loss: 0.2633 - regression_loss: 0.2502 - classification_loss: 0.0131 346/500 [===================>..........] - ETA: 52s - loss: 0.2631 - regression_loss: 0.2500 - classification_loss: 0.0131 347/500 [===================>..........] - ETA: 52s - loss: 0.2629 - regression_loss: 0.2498 - classification_loss: 0.0131 348/500 [===================>..........] - ETA: 52s - loss: 0.2632 - regression_loss: 0.2501 - classification_loss: 0.0131 349/500 [===================>..........] - ETA: 51s - loss: 0.2630 - regression_loss: 0.2499 - classification_loss: 0.0131 350/500 [====================>.........] 
- ETA: 51s - loss: 0.2631 - regression_loss: 0.2500 - classification_loss: 0.0131 351/500 [====================>.........] - ETA: 51s - loss: 0.2632 - regression_loss: 0.2500 - classification_loss: 0.0132 352/500 [====================>.........] - ETA: 50s - loss: 0.2630 - regression_loss: 0.2499 - classification_loss: 0.0131 353/500 [====================>.........] - ETA: 50s - loss: 0.2627 - regression_loss: 0.2496 - classification_loss: 0.0131 354/500 [====================>.........] - ETA: 50s - loss: 0.2628 - regression_loss: 0.2497 - classification_loss: 0.0131 355/500 [====================>.........] - ETA: 49s - loss: 0.2630 - regression_loss: 0.2498 - classification_loss: 0.0132 356/500 [====================>.........] - ETA: 49s - loss: 0.2635 - regression_loss: 0.2503 - classification_loss: 0.0132 357/500 [====================>.........] - ETA: 49s - loss: 0.2638 - regression_loss: 0.2506 - classification_loss: 0.0133 358/500 [====================>.........] - ETA: 48s - loss: 0.2639 - regression_loss: 0.2506 - classification_loss: 0.0133 359/500 [====================>.........] - ETA: 48s - loss: 0.2635 - regression_loss: 0.2502 - classification_loss: 0.0132 360/500 [====================>.........] - ETA: 48s - loss: 0.2639 - regression_loss: 0.2507 - classification_loss: 0.0133 361/500 [====================>.........] - ETA: 47s - loss: 0.2640 - regression_loss: 0.2507 - classification_loss: 0.0133 362/500 [====================>.........] - ETA: 47s - loss: 0.2635 - regression_loss: 0.2503 - classification_loss: 0.0133 363/500 [====================>.........] - ETA: 46s - loss: 0.2636 - regression_loss: 0.2503 - classification_loss: 0.0133 364/500 [====================>.........] - ETA: 46s - loss: 0.2633 - regression_loss: 0.2501 - classification_loss: 0.0133 365/500 [====================>.........] - ETA: 46s - loss: 0.2635 - regression_loss: 0.2502 - classification_loss: 0.0133 366/500 [====================>.........] 
- ETA: 45s - loss: 0.2633 - regression_loss: 0.2501 - classification_loss: 0.0133
[per-step progress for steps 367-499 of epoch 43 elided; running loss held near 0.26 (regression ~0.25, classification ~0.013)]
500/500 [==============================] - 171s 342ms/step - loss: 0.2663 - regression_loss: 0.2532 - classification_loss: 0.0131
1172 instances of class plum with average precision: 0.7421
mAP: 0.7421
Epoch 00043: saving model to ./training/snapshots/resnet101_pascal_43.h5
Epoch 44/150
[per-step progress for steps 1-8 of epoch 44 elided]
9/500 [..............................]
- ETA: 2:47 - loss: 0.2813 - regression_loss: 0.2638 - classification_loss: 0.0175
[per-step progress for steps 10-199 of epoch 44 elided; running loss fluctuated between roughly 0.23 and 0.28]
200/500 [===========>..................] - ETA: 1:42 - loss: 0.2647 - regression_loss: 0.2519 - classification_loss: 0.0129
201/500 [===========>..................]
- ETA: 1:41 - loss: 0.2651 - regression_loss: 0.2522 - classification_loss: 0.0129 202/500 [===========>..................] - ETA: 1:41 - loss: 0.2666 - regression_loss: 0.2536 - classification_loss: 0.0130 203/500 [===========>..................] - ETA: 1:41 - loss: 0.2662 - regression_loss: 0.2532 - classification_loss: 0.0129 204/500 [===========>..................] - ETA: 1:40 - loss: 0.2659 - regression_loss: 0.2530 - classification_loss: 0.0129 205/500 [===========>..................] - ETA: 1:40 - loss: 0.2653 - regression_loss: 0.2525 - classification_loss: 0.0129 206/500 [===========>..................] - ETA: 1:40 - loss: 0.2663 - regression_loss: 0.2533 - classification_loss: 0.0130 207/500 [===========>..................] - ETA: 1:39 - loss: 0.2659 - regression_loss: 0.2529 - classification_loss: 0.0130 208/500 [===========>..................] - ETA: 1:39 - loss: 0.2658 - regression_loss: 0.2529 - classification_loss: 0.0129 209/500 [===========>..................] - ETA: 1:39 - loss: 0.2666 - regression_loss: 0.2536 - classification_loss: 0.0129 210/500 [===========>..................] - ETA: 1:38 - loss: 0.2667 - regression_loss: 0.2538 - classification_loss: 0.0129 211/500 [===========>..................] - ETA: 1:38 - loss: 0.2667 - regression_loss: 0.2538 - classification_loss: 0.0129 212/500 [===========>..................] - ETA: 1:38 - loss: 0.2661 - regression_loss: 0.2532 - classification_loss: 0.0129 213/500 [===========>..................] - ETA: 1:37 - loss: 0.2661 - regression_loss: 0.2532 - classification_loss: 0.0129 214/500 [===========>..................] - ETA: 1:37 - loss: 0.2664 - regression_loss: 0.2535 - classification_loss: 0.0129 215/500 [===========>..................] - ETA: 1:37 - loss: 0.2663 - regression_loss: 0.2534 - classification_loss: 0.0129 216/500 [===========>..................] - ETA: 1:36 - loss: 0.2665 - regression_loss: 0.2535 - classification_loss: 0.0129 217/500 [============>.................] 
- ETA: 1:36 - loss: 0.2662 - regression_loss: 0.2533 - classification_loss: 0.0129 218/500 [============>.................] - ETA: 1:36 - loss: 0.2664 - regression_loss: 0.2534 - classification_loss: 0.0130 219/500 [============>.................] - ETA: 1:35 - loss: 0.2664 - regression_loss: 0.2534 - classification_loss: 0.0130 220/500 [============>.................] - ETA: 1:35 - loss: 0.2669 - regression_loss: 0.2538 - classification_loss: 0.0130 221/500 [============>.................] - ETA: 1:34 - loss: 0.2666 - regression_loss: 0.2535 - classification_loss: 0.0131 222/500 [============>.................] - ETA: 1:34 - loss: 0.2674 - regression_loss: 0.2540 - classification_loss: 0.0134 223/500 [============>.................] - ETA: 1:34 - loss: 0.2677 - regression_loss: 0.2543 - classification_loss: 0.0134 224/500 [============>.................] - ETA: 1:33 - loss: 0.2674 - regression_loss: 0.2540 - classification_loss: 0.0134 225/500 [============>.................] - ETA: 1:33 - loss: 0.2668 - regression_loss: 0.2534 - classification_loss: 0.0134 226/500 [============>.................] - ETA: 1:33 - loss: 0.2668 - regression_loss: 0.2534 - classification_loss: 0.0134 227/500 [============>.................] - ETA: 1:32 - loss: 0.2672 - regression_loss: 0.2538 - classification_loss: 0.0134 228/500 [============>.................] - ETA: 1:32 - loss: 0.2674 - regression_loss: 0.2540 - classification_loss: 0.0134 229/500 [============>.................] - ETA: 1:32 - loss: 0.2676 - regression_loss: 0.2541 - classification_loss: 0.0134 230/500 [============>.................] - ETA: 1:31 - loss: 0.2669 - regression_loss: 0.2535 - classification_loss: 0.0134 231/500 [============>.................] - ETA: 1:31 - loss: 0.2667 - regression_loss: 0.2533 - classification_loss: 0.0134 232/500 [============>.................] - ETA: 1:31 - loss: 0.2663 - regression_loss: 0.2529 - classification_loss: 0.0134 233/500 [============>.................] 
- ETA: 1:30 - loss: 0.2662 - regression_loss: 0.2529 - classification_loss: 0.0134 234/500 [=============>................] - ETA: 1:30 - loss: 0.2661 - regression_loss: 0.2528 - classification_loss: 0.0133 235/500 [=============>................] - ETA: 1:30 - loss: 0.2656 - regression_loss: 0.2523 - classification_loss: 0.0133 236/500 [=============>................] - ETA: 1:29 - loss: 0.2658 - regression_loss: 0.2525 - classification_loss: 0.0134 237/500 [=============>................] - ETA: 1:29 - loss: 0.2655 - regression_loss: 0.2522 - classification_loss: 0.0133 238/500 [=============>................] - ETA: 1:29 - loss: 0.2652 - regression_loss: 0.2519 - classification_loss: 0.0133 239/500 [=============>................] - ETA: 1:28 - loss: 0.2644 - regression_loss: 0.2511 - classification_loss: 0.0133 240/500 [=============>................] - ETA: 1:28 - loss: 0.2642 - regression_loss: 0.2510 - classification_loss: 0.0133 241/500 [=============>................] - ETA: 1:28 - loss: 0.2639 - regression_loss: 0.2507 - classification_loss: 0.0132 242/500 [=============>................] - ETA: 1:27 - loss: 0.2635 - regression_loss: 0.2503 - classification_loss: 0.0132 243/500 [=============>................] - ETA: 1:27 - loss: 0.2643 - regression_loss: 0.2510 - classification_loss: 0.0133 244/500 [=============>................] - ETA: 1:27 - loss: 0.2640 - regression_loss: 0.2508 - classification_loss: 0.0132 245/500 [=============>................] - ETA: 1:26 - loss: 0.2639 - regression_loss: 0.2507 - classification_loss: 0.0132 246/500 [=============>................] - ETA: 1:26 - loss: 0.2637 - regression_loss: 0.2505 - classification_loss: 0.0132 247/500 [=============>................] - ETA: 1:26 - loss: 0.2637 - regression_loss: 0.2505 - classification_loss: 0.0132 248/500 [=============>................] - ETA: 1:25 - loss: 0.2635 - regression_loss: 0.2503 - classification_loss: 0.0132 249/500 [=============>................] 
- ETA: 1:25 - loss: 0.2630 - regression_loss: 0.2498 - classification_loss: 0.0132 250/500 [==============>...............] - ETA: 1:24 - loss: 0.2627 - regression_loss: 0.2495 - classification_loss: 0.0132 251/500 [==============>...............] - ETA: 1:24 - loss: 0.2626 - regression_loss: 0.2495 - classification_loss: 0.0132 252/500 [==============>...............] - ETA: 1:24 - loss: 0.2621 - regression_loss: 0.2489 - classification_loss: 0.0131 253/500 [==============>...............] - ETA: 1:23 - loss: 0.2618 - regression_loss: 0.2487 - classification_loss: 0.0131 254/500 [==============>...............] - ETA: 1:23 - loss: 0.2614 - regression_loss: 0.2484 - classification_loss: 0.0131 255/500 [==============>...............] - ETA: 1:23 - loss: 0.2610 - regression_loss: 0.2480 - classification_loss: 0.0130 256/500 [==============>...............] - ETA: 1:22 - loss: 0.2606 - regression_loss: 0.2476 - classification_loss: 0.0130 257/500 [==============>...............] - ETA: 1:22 - loss: 0.2612 - regression_loss: 0.2480 - classification_loss: 0.0131 258/500 [==============>...............] - ETA: 1:22 - loss: 0.2621 - regression_loss: 0.2490 - classification_loss: 0.0131 259/500 [==============>...............] - ETA: 1:21 - loss: 0.2620 - regression_loss: 0.2489 - classification_loss: 0.0131 260/500 [==============>...............] - ETA: 1:21 - loss: 0.2614 - regression_loss: 0.2482 - classification_loss: 0.0131 261/500 [==============>...............] - ETA: 1:21 - loss: 0.2609 - regression_loss: 0.2479 - classification_loss: 0.0131 262/500 [==============>...............] - ETA: 1:20 - loss: 0.2615 - regression_loss: 0.2484 - classification_loss: 0.0131 263/500 [==============>...............] - ETA: 1:20 - loss: 0.2618 - regression_loss: 0.2487 - classification_loss: 0.0131 264/500 [==============>...............] - ETA: 1:20 - loss: 0.2614 - regression_loss: 0.2483 - classification_loss: 0.0131 265/500 [==============>...............] 
- ETA: 1:19 - loss: 0.2614 - regression_loss: 0.2484 - classification_loss: 0.0131 266/500 [==============>...............] - ETA: 1:19 - loss: 0.2613 - regression_loss: 0.2483 - classification_loss: 0.0131 267/500 [===============>..............] - ETA: 1:19 - loss: 0.2616 - regression_loss: 0.2485 - classification_loss: 0.0131 268/500 [===============>..............] - ETA: 1:18 - loss: 0.2612 - regression_loss: 0.2481 - classification_loss: 0.0131 269/500 [===============>..............] - ETA: 1:18 - loss: 0.2618 - regression_loss: 0.2487 - classification_loss: 0.0131 270/500 [===============>..............] - ETA: 1:18 - loss: 0.2617 - regression_loss: 0.2487 - classification_loss: 0.0131 271/500 [===============>..............] - ETA: 1:17 - loss: 0.2614 - regression_loss: 0.2483 - classification_loss: 0.0131 272/500 [===============>..............] - ETA: 1:17 - loss: 0.2612 - regression_loss: 0.2482 - classification_loss: 0.0131 273/500 [===============>..............] - ETA: 1:16 - loss: 0.2607 - regression_loss: 0.2477 - classification_loss: 0.0130 274/500 [===============>..............] - ETA: 1:16 - loss: 0.2609 - regression_loss: 0.2479 - classification_loss: 0.0130 275/500 [===============>..............] - ETA: 1:16 - loss: 0.2608 - regression_loss: 0.2478 - classification_loss: 0.0130 276/500 [===============>..............] - ETA: 1:15 - loss: 0.2605 - regression_loss: 0.2475 - classification_loss: 0.0130 277/500 [===============>..............] - ETA: 1:15 - loss: 0.2610 - regression_loss: 0.2479 - classification_loss: 0.0131 278/500 [===============>..............] - ETA: 1:15 - loss: 0.2617 - regression_loss: 0.2486 - classification_loss: 0.0131 279/500 [===============>..............] - ETA: 1:14 - loss: 0.2618 - regression_loss: 0.2486 - classification_loss: 0.0131 280/500 [===============>..............] - ETA: 1:14 - loss: 0.2614 - regression_loss: 0.2483 - classification_loss: 0.0131 281/500 [===============>..............] 
- ETA: 1:14 - loss: 0.2609 - regression_loss: 0.2478 - classification_loss: 0.0131 282/500 [===============>..............] - ETA: 1:13 - loss: 0.2614 - regression_loss: 0.2483 - classification_loss: 0.0131 283/500 [===============>..............] - ETA: 1:13 - loss: 0.2611 - regression_loss: 0.2480 - classification_loss: 0.0131 284/500 [================>.............] - ETA: 1:13 - loss: 0.2608 - regression_loss: 0.2477 - classification_loss: 0.0131 285/500 [================>.............] - ETA: 1:12 - loss: 0.2618 - regression_loss: 0.2487 - classification_loss: 0.0131 286/500 [================>.............] - ETA: 1:12 - loss: 0.2619 - regression_loss: 0.2488 - classification_loss: 0.0131 287/500 [================>.............] - ETA: 1:12 - loss: 0.2614 - regression_loss: 0.2484 - classification_loss: 0.0130 288/500 [================>.............] - ETA: 1:11 - loss: 0.2610 - regression_loss: 0.2480 - classification_loss: 0.0130 289/500 [================>.............] - ETA: 1:11 - loss: 0.2609 - regression_loss: 0.2479 - classification_loss: 0.0130 290/500 [================>.............] - ETA: 1:11 - loss: 0.2611 - regression_loss: 0.2481 - classification_loss: 0.0130 291/500 [================>.............] - ETA: 1:10 - loss: 0.2609 - regression_loss: 0.2480 - classification_loss: 0.0129 292/500 [================>.............] - ETA: 1:10 - loss: 0.2617 - regression_loss: 0.2487 - classification_loss: 0.0131 293/500 [================>.............] - ETA: 1:10 - loss: 0.2614 - regression_loss: 0.2483 - classification_loss: 0.0130 294/500 [================>.............] - ETA: 1:09 - loss: 0.2611 - regression_loss: 0.2481 - classification_loss: 0.0130 295/500 [================>.............] - ETA: 1:09 - loss: 0.2613 - regression_loss: 0.2482 - classification_loss: 0.0130 296/500 [================>.............] - ETA: 1:09 - loss: 0.2614 - regression_loss: 0.2484 - classification_loss: 0.0130 297/500 [================>.............] 
- ETA: 1:08 - loss: 0.2618 - regression_loss: 0.2487 - classification_loss: 0.0130 298/500 [================>.............] - ETA: 1:08 - loss: 0.2622 - regression_loss: 0.2491 - classification_loss: 0.0131 299/500 [================>.............] - ETA: 1:08 - loss: 0.2620 - regression_loss: 0.2490 - classification_loss: 0.0131 300/500 [=================>............] - ETA: 1:07 - loss: 0.2620 - regression_loss: 0.2490 - classification_loss: 0.0130 301/500 [=================>............] - ETA: 1:07 - loss: 0.2618 - regression_loss: 0.2488 - classification_loss: 0.0130 302/500 [=================>............] - ETA: 1:07 - loss: 0.2616 - regression_loss: 0.2487 - classification_loss: 0.0130 303/500 [=================>............] - ETA: 1:06 - loss: 0.2616 - regression_loss: 0.2487 - classification_loss: 0.0129 304/500 [=================>............] - ETA: 1:06 - loss: 0.2618 - regression_loss: 0.2489 - classification_loss: 0.0129 305/500 [=================>............] - ETA: 1:06 - loss: 0.2614 - regression_loss: 0.2485 - classification_loss: 0.0129 306/500 [=================>............] - ETA: 1:05 - loss: 0.2614 - regression_loss: 0.2485 - classification_loss: 0.0129 307/500 [=================>............] - ETA: 1:05 - loss: 0.2613 - regression_loss: 0.2484 - classification_loss: 0.0129 308/500 [=================>............] - ETA: 1:04 - loss: 0.2614 - regression_loss: 0.2485 - classification_loss: 0.0129 309/500 [=================>............] - ETA: 1:04 - loss: 0.2609 - regression_loss: 0.2481 - classification_loss: 0.0129 310/500 [=================>............] - ETA: 1:04 - loss: 0.2613 - regression_loss: 0.2483 - classification_loss: 0.0129 311/500 [=================>............] - ETA: 1:03 - loss: 0.2609 - regression_loss: 0.2480 - classification_loss: 0.0129 312/500 [=================>............] - ETA: 1:03 - loss: 0.2605 - regression_loss: 0.2476 - classification_loss: 0.0129 313/500 [=================>............] 
- ETA: 1:03 - loss: 0.2609 - regression_loss: 0.2479 - classification_loss: 0.0129 314/500 [=================>............] - ETA: 1:02 - loss: 0.2614 - regression_loss: 0.2485 - classification_loss: 0.0129 315/500 [=================>............] - ETA: 1:02 - loss: 0.2615 - regression_loss: 0.2485 - classification_loss: 0.0130 316/500 [=================>............] - ETA: 1:02 - loss: 0.2615 - regression_loss: 0.2485 - classification_loss: 0.0130 317/500 [==================>...........] - ETA: 1:01 - loss: 0.2613 - regression_loss: 0.2483 - classification_loss: 0.0130 318/500 [==================>...........] - ETA: 1:01 - loss: 0.2621 - regression_loss: 0.2490 - classification_loss: 0.0130 319/500 [==================>...........] - ETA: 1:01 - loss: 0.2625 - regression_loss: 0.2494 - classification_loss: 0.0130 320/500 [==================>...........] - ETA: 1:00 - loss: 0.2621 - regression_loss: 0.2491 - classification_loss: 0.0130 321/500 [==================>...........] - ETA: 1:00 - loss: 0.2620 - regression_loss: 0.2490 - classification_loss: 0.0130 322/500 [==================>...........] - ETA: 1:00 - loss: 0.2625 - regression_loss: 0.2495 - classification_loss: 0.0130 323/500 [==================>...........] - ETA: 59s - loss: 0.2620 - regression_loss: 0.2490 - classification_loss: 0.0130  324/500 [==================>...........] - ETA: 59s - loss: 0.2619 - regression_loss: 0.2488 - classification_loss: 0.0130 325/500 [==================>...........] - ETA: 59s - loss: 0.2624 - regression_loss: 0.2493 - classification_loss: 0.0130 326/500 [==================>...........] - ETA: 58s - loss: 0.2623 - regression_loss: 0.2493 - classification_loss: 0.0130 327/500 [==================>...........] - ETA: 58s - loss: 0.2630 - regression_loss: 0.2499 - classification_loss: 0.0131 328/500 [==================>...........] - ETA: 58s - loss: 0.2632 - regression_loss: 0.2502 - classification_loss: 0.0130 329/500 [==================>...........] 
- ETA: 57s - loss: 0.2638 - regression_loss: 0.2507 - classification_loss: 0.0130 330/500 [==================>...........] - ETA: 57s - loss: 0.2638 - regression_loss: 0.2508 - classification_loss: 0.0130 331/500 [==================>...........] - ETA: 57s - loss: 0.2634 - regression_loss: 0.2504 - classification_loss: 0.0130 332/500 [==================>...........] - ETA: 56s - loss: 0.2633 - regression_loss: 0.2504 - classification_loss: 0.0130 333/500 [==================>...........] - ETA: 56s - loss: 0.2634 - regression_loss: 0.2505 - classification_loss: 0.0130 334/500 [===================>..........] - ETA: 56s - loss: 0.2635 - regression_loss: 0.2505 - classification_loss: 0.0130 335/500 [===================>..........] - ETA: 55s - loss: 0.2636 - regression_loss: 0.2506 - classification_loss: 0.0130 336/500 [===================>..........] - ETA: 55s - loss: 0.2635 - regression_loss: 0.2505 - classification_loss: 0.0130 337/500 [===================>..........] - ETA: 55s - loss: 0.2632 - regression_loss: 0.2503 - classification_loss: 0.0130 338/500 [===================>..........] - ETA: 54s - loss: 0.2629 - regression_loss: 0.2499 - classification_loss: 0.0129 339/500 [===================>..........] - ETA: 54s - loss: 0.2629 - regression_loss: 0.2500 - classification_loss: 0.0129 340/500 [===================>..........] - ETA: 54s - loss: 0.2627 - regression_loss: 0.2498 - classification_loss: 0.0129 341/500 [===================>..........] - ETA: 53s - loss: 0.2626 - regression_loss: 0.2497 - classification_loss: 0.0129 342/500 [===================>..........] - ETA: 53s - loss: 0.2624 - regression_loss: 0.2494 - classification_loss: 0.0129 343/500 [===================>..........] - ETA: 53s - loss: 0.2625 - regression_loss: 0.2496 - classification_loss: 0.0130 344/500 [===================>..........] - ETA: 52s - loss: 0.2628 - regression_loss: 0.2498 - classification_loss: 0.0129 345/500 [===================>..........] 
- ETA: 52s - loss: 0.2626 - regression_loss: 0.2497 - classification_loss: 0.0129 346/500 [===================>..........] - ETA: 52s - loss: 0.2625 - regression_loss: 0.2496 - classification_loss: 0.0129 347/500 [===================>..........] - ETA: 51s - loss: 0.2622 - regression_loss: 0.2493 - classification_loss: 0.0129 348/500 [===================>..........] - ETA: 51s - loss: 0.2620 - regression_loss: 0.2491 - classification_loss: 0.0129 349/500 [===================>..........] - ETA: 51s - loss: 0.2620 - regression_loss: 0.2491 - classification_loss: 0.0129 350/500 [====================>.........] - ETA: 50s - loss: 0.2624 - regression_loss: 0.2495 - classification_loss: 0.0129 351/500 [====================>.........] - ETA: 50s - loss: 0.2620 - regression_loss: 0.2492 - classification_loss: 0.0129 352/500 [====================>.........] - ETA: 50s - loss: 0.2622 - regression_loss: 0.2494 - classification_loss: 0.0129 353/500 [====================>.........] - ETA: 49s - loss: 0.2625 - regression_loss: 0.2496 - classification_loss: 0.0129 354/500 [====================>.........] - ETA: 49s - loss: 0.2631 - regression_loss: 0.2503 - classification_loss: 0.0129 355/500 [====================>.........] - ETA: 49s - loss: 0.2634 - regression_loss: 0.2506 - classification_loss: 0.0129 356/500 [====================>.........] - ETA: 48s - loss: 0.2632 - regression_loss: 0.2503 - classification_loss: 0.0129 357/500 [====================>.........] - ETA: 48s - loss: 0.2628 - regression_loss: 0.2499 - classification_loss: 0.0129 358/500 [====================>.........] - ETA: 48s - loss: 0.2628 - regression_loss: 0.2498 - classification_loss: 0.0129 359/500 [====================>.........] - ETA: 47s - loss: 0.2633 - regression_loss: 0.2503 - classification_loss: 0.0130 360/500 [====================>.........] - ETA: 47s - loss: 0.2637 - regression_loss: 0.2506 - classification_loss: 0.0131 361/500 [====================>.........] 
- ETA: 47s - loss: 0.2634 - regression_loss: 0.2503 - classification_loss: 0.0131 362/500 [====================>.........] - ETA: 46s - loss: 0.2636 - regression_loss: 0.2505 - classification_loss: 0.0131 363/500 [====================>.........] - ETA: 46s - loss: 0.2634 - regression_loss: 0.2503 - classification_loss: 0.0131 364/500 [====================>.........] - ETA: 45s - loss: 0.2631 - regression_loss: 0.2500 - classification_loss: 0.0131 365/500 [====================>.........] - ETA: 45s - loss: 0.2627 - regression_loss: 0.2497 - classification_loss: 0.0131 366/500 [====================>.........] - ETA: 45s - loss: 0.2625 - regression_loss: 0.2494 - classification_loss: 0.0130 367/500 [=====================>........] - ETA: 44s - loss: 0.2622 - regression_loss: 0.2492 - classification_loss: 0.0130 368/500 [=====================>........] - ETA: 44s - loss: 0.2617 - regression_loss: 0.2487 - classification_loss: 0.0130 369/500 [=====================>........] - ETA: 44s - loss: 0.2619 - regression_loss: 0.2489 - classification_loss: 0.0130 370/500 [=====================>........] - ETA: 43s - loss: 0.2618 - regression_loss: 0.2488 - classification_loss: 0.0130 371/500 [=====================>........] - ETA: 43s - loss: 0.2626 - regression_loss: 0.2495 - classification_loss: 0.0131 372/500 [=====================>........] - ETA: 43s - loss: 0.2627 - regression_loss: 0.2497 - classification_loss: 0.0130 373/500 [=====================>........] - ETA: 42s - loss: 0.2625 - regression_loss: 0.2495 - classification_loss: 0.0130 374/500 [=====================>........] - ETA: 42s - loss: 0.2623 - regression_loss: 0.2492 - classification_loss: 0.0130 375/500 [=====================>........] - ETA: 42s - loss: 0.2621 - regression_loss: 0.2491 - classification_loss: 0.0130 376/500 [=====================>........] - ETA: 41s - loss: 0.2624 - regression_loss: 0.2493 - classification_loss: 0.0130 377/500 [=====================>........] 
- ETA: 41s - loss: 0.2625 - regression_loss: 0.2495 - classification_loss: 0.0130 378/500 [=====================>........] - ETA: 41s - loss: 0.2623 - regression_loss: 0.2493 - classification_loss: 0.0130 379/500 [=====================>........] - ETA: 40s - loss: 0.2622 - regression_loss: 0.2493 - classification_loss: 0.0130 380/500 [=====================>........] - ETA: 40s - loss: 0.2620 - regression_loss: 0.2490 - classification_loss: 0.0130 381/500 [=====================>........] - ETA: 40s - loss: 0.2618 - regression_loss: 0.2488 - classification_loss: 0.0130 382/500 [=====================>........] - ETA: 39s - loss: 0.2617 - regression_loss: 0.2487 - classification_loss: 0.0130 383/500 [=====================>........] - ETA: 39s - loss: 0.2625 - regression_loss: 0.2495 - classification_loss: 0.0131 384/500 [======================>.......] - ETA: 39s - loss: 0.2628 - regression_loss: 0.2497 - classification_loss: 0.0131 385/500 [======================>.......] - ETA: 38s - loss: 0.2630 - regression_loss: 0.2499 - classification_loss: 0.0131 386/500 [======================>.......] - ETA: 38s - loss: 0.2632 - regression_loss: 0.2502 - classification_loss: 0.0131 387/500 [======================>.......] - ETA: 38s - loss: 0.2630 - regression_loss: 0.2499 - classification_loss: 0.0131 388/500 [======================>.......] - ETA: 37s - loss: 0.2630 - regression_loss: 0.2500 - classification_loss: 0.0131 389/500 [======================>.......] - ETA: 37s - loss: 0.2631 - regression_loss: 0.2500 - classification_loss: 0.0131 390/500 [======================>.......] - ETA: 37s - loss: 0.2632 - regression_loss: 0.2501 - classification_loss: 0.0131 391/500 [======================>.......] - ETA: 36s - loss: 0.2630 - regression_loss: 0.2499 - classification_loss: 0.0131 392/500 [======================>.......] - ETA: 36s - loss: 0.2633 - regression_loss: 0.2502 - classification_loss: 0.0131 393/500 [======================>.......] 
- ETA: 36s - loss: 0.2633 - regression_loss: 0.2502 - classification_loss: 0.0131 394/500 [======================>.......] - ETA: 35s - loss: 0.2636 - regression_loss: 0.2505 - classification_loss: 0.0131 395/500 [======================>.......] - ETA: 35s - loss: 0.2632 - regression_loss: 0.2501 - classification_loss: 0.0131 396/500 [======================>.......] - ETA: 35s - loss: 0.2633 - regression_loss: 0.2502 - classification_loss: 0.0131 397/500 [======================>.......] - ETA: 34s - loss: 0.2635 - regression_loss: 0.2504 - classification_loss: 0.0131 398/500 [======================>.......] - ETA: 34s - loss: 0.2633 - regression_loss: 0.2502 - classification_loss: 0.0131 399/500 [======================>.......] - ETA: 34s - loss: 0.2632 - regression_loss: 0.2502 - classification_loss: 0.0131 400/500 [=======================>......] - ETA: 33s - loss: 0.2634 - regression_loss: 0.2504 - classification_loss: 0.0131 401/500 [=======================>......] - ETA: 33s - loss: 0.2636 - regression_loss: 0.2505 - classification_loss: 0.0131 402/500 [=======================>......] - ETA: 33s - loss: 0.2646 - regression_loss: 0.2515 - classification_loss: 0.0130 403/500 [=======================>......] - ETA: 32s - loss: 0.2642 - regression_loss: 0.2512 - classification_loss: 0.0130 404/500 [=======================>......] - ETA: 32s - loss: 0.2642 - regression_loss: 0.2512 - classification_loss: 0.0130 405/500 [=======================>......] - ETA: 32s - loss: 0.2639 - regression_loss: 0.2509 - classification_loss: 0.0130 406/500 [=======================>......] - ETA: 31s - loss: 0.2641 - regression_loss: 0.2511 - classification_loss: 0.0130 407/500 [=======================>......] - ETA: 31s - loss: 0.2647 - regression_loss: 0.2516 - classification_loss: 0.0131 408/500 [=======================>......] - ETA: 31s - loss: 0.2643 - regression_loss: 0.2513 - classification_loss: 0.0131 409/500 [=======================>......] 
[epoch 44 per-step progress output condensed: steps 410–499, loss holding at ~0.263 (regression ~0.250, classification ~0.013)]
500/500 [==============================] - 169s 338ms/step - loss: 0.2631 - regression_loss: 0.2502 - classification_loss: 0.0129
1172 instances of class plum with average precision: 0.7462
mAP: 0.7462
Epoch 00044: saving model to ./training/snapshots/resnet101_pascal_44.h5
Epoch 45/150
[epoch 45 per-step progress output condensed: steps 1–244, ETA falling from ~2:46 to ~1:26, loss settling around 0.25 (regression ~0.238, classification ~0.012)]
- ETA: 1:26 - loss: 0.2524 - regression_loss: 0.2398 - classification_loss: 0.0126 245/500 [=============>................] - ETA: 1:26 - loss: 0.2520 - regression_loss: 0.2394 - classification_loss: 0.0126 246/500 [=============>................] - ETA: 1:25 - loss: 0.2515 - regression_loss: 0.2389 - classification_loss: 0.0125 247/500 [=============>................] - ETA: 1:25 - loss: 0.2516 - regression_loss: 0.2391 - classification_loss: 0.0126 248/500 [=============>................] - ETA: 1:25 - loss: 0.2513 - regression_loss: 0.2387 - classification_loss: 0.0126 249/500 [=============>................] - ETA: 1:24 - loss: 0.2508 - regression_loss: 0.2383 - classification_loss: 0.0125 250/500 [==============>...............] - ETA: 1:24 - loss: 0.2505 - regression_loss: 0.2380 - classification_loss: 0.0125 251/500 [==============>...............] - ETA: 1:24 - loss: 0.2508 - regression_loss: 0.2382 - classification_loss: 0.0125 252/500 [==============>...............] - ETA: 1:23 - loss: 0.2503 - regression_loss: 0.2378 - classification_loss: 0.0125 253/500 [==============>...............] - ETA: 1:23 - loss: 0.2502 - regression_loss: 0.2377 - classification_loss: 0.0125 254/500 [==============>...............] - ETA: 1:22 - loss: 0.2500 - regression_loss: 0.2374 - classification_loss: 0.0125 255/500 [==============>...............] - ETA: 1:22 - loss: 0.2499 - regression_loss: 0.2374 - classification_loss: 0.0125 256/500 [==============>...............] - ETA: 1:22 - loss: 0.2503 - regression_loss: 0.2378 - classification_loss: 0.0126 257/500 [==============>...............] - ETA: 1:21 - loss: 0.2501 - regression_loss: 0.2376 - classification_loss: 0.0125 258/500 [==============>...............] - ETA: 1:21 - loss: 0.2498 - regression_loss: 0.2373 - classification_loss: 0.0125 259/500 [==============>...............] - ETA: 1:21 - loss: 0.2492 - regression_loss: 0.2367 - classification_loss: 0.0125 260/500 [==============>...............] 
- ETA: 1:20 - loss: 0.2496 - regression_loss: 0.2371 - classification_loss: 0.0125 261/500 [==============>...............] - ETA: 1:20 - loss: 0.2499 - regression_loss: 0.2374 - classification_loss: 0.0125 262/500 [==============>...............] - ETA: 1:20 - loss: 0.2494 - regression_loss: 0.2369 - classification_loss: 0.0125 263/500 [==============>...............] - ETA: 1:19 - loss: 0.2489 - regression_loss: 0.2365 - classification_loss: 0.0124 264/500 [==============>...............] - ETA: 1:19 - loss: 0.2488 - regression_loss: 0.2363 - classification_loss: 0.0124 265/500 [==============>...............] - ETA: 1:19 - loss: 0.2484 - regression_loss: 0.2360 - classification_loss: 0.0124 266/500 [==============>...............] - ETA: 1:18 - loss: 0.2486 - regression_loss: 0.2362 - classification_loss: 0.0124 267/500 [===============>..............] - ETA: 1:18 - loss: 0.2488 - regression_loss: 0.2363 - classification_loss: 0.0125 268/500 [===============>..............] - ETA: 1:18 - loss: 0.2483 - regression_loss: 0.2359 - classification_loss: 0.0124 269/500 [===============>..............] - ETA: 1:17 - loss: 0.2490 - regression_loss: 0.2365 - classification_loss: 0.0125 270/500 [===============>..............] - ETA: 1:17 - loss: 0.2492 - regression_loss: 0.2367 - classification_loss: 0.0125 271/500 [===============>..............] - ETA: 1:17 - loss: 0.2487 - regression_loss: 0.2363 - classification_loss: 0.0125 272/500 [===============>..............] - ETA: 1:16 - loss: 0.2491 - regression_loss: 0.2366 - classification_loss: 0.0125 273/500 [===============>..............] - ETA: 1:16 - loss: 0.2485 - regression_loss: 0.2361 - classification_loss: 0.0125 274/500 [===============>..............] - ETA: 1:16 - loss: 0.2489 - regression_loss: 0.2364 - classification_loss: 0.0125 275/500 [===============>..............] - ETA: 1:15 - loss: 0.2486 - regression_loss: 0.2362 - classification_loss: 0.0124 276/500 [===============>..............] 
- ETA: 1:15 - loss: 0.2485 - regression_loss: 0.2361 - classification_loss: 0.0124 277/500 [===============>..............] - ETA: 1:15 - loss: 0.2484 - regression_loss: 0.2361 - classification_loss: 0.0124 278/500 [===============>..............] - ETA: 1:14 - loss: 0.2480 - regression_loss: 0.2357 - classification_loss: 0.0123 279/500 [===============>..............] - ETA: 1:14 - loss: 0.2477 - regression_loss: 0.2354 - classification_loss: 0.0123 280/500 [===============>..............] - ETA: 1:14 - loss: 0.2482 - regression_loss: 0.2358 - classification_loss: 0.0123 281/500 [===============>..............] - ETA: 1:13 - loss: 0.2480 - regression_loss: 0.2357 - classification_loss: 0.0123 282/500 [===============>..............] - ETA: 1:13 - loss: 0.2477 - regression_loss: 0.2354 - classification_loss: 0.0123 283/500 [===============>..............] - ETA: 1:13 - loss: 0.2476 - regression_loss: 0.2353 - classification_loss: 0.0123 284/500 [================>.............] - ETA: 1:12 - loss: 0.2476 - regression_loss: 0.2353 - classification_loss: 0.0123 285/500 [================>.............] - ETA: 1:12 - loss: 0.2473 - regression_loss: 0.2351 - classification_loss: 0.0122 286/500 [================>.............] - ETA: 1:12 - loss: 0.2470 - regression_loss: 0.2348 - classification_loss: 0.0122 287/500 [================>.............] - ETA: 1:11 - loss: 0.2468 - regression_loss: 0.2346 - classification_loss: 0.0122 288/500 [================>.............] - ETA: 1:11 - loss: 0.2464 - regression_loss: 0.2342 - classification_loss: 0.0122 289/500 [================>.............] - ETA: 1:11 - loss: 0.2459 - regression_loss: 0.2337 - classification_loss: 0.0121 290/500 [================>.............] - ETA: 1:10 - loss: 0.2457 - regression_loss: 0.2335 - classification_loss: 0.0122 291/500 [================>.............] - ETA: 1:10 - loss: 0.2456 - regression_loss: 0.2334 - classification_loss: 0.0121 292/500 [================>.............] 
- ETA: 1:10 - loss: 0.2457 - regression_loss: 0.2335 - classification_loss: 0.0122 293/500 [================>.............] - ETA: 1:09 - loss: 0.2453 - regression_loss: 0.2331 - classification_loss: 0.0122 294/500 [================>.............] - ETA: 1:09 - loss: 0.2454 - regression_loss: 0.2332 - classification_loss: 0.0122 295/500 [================>.............] - ETA: 1:09 - loss: 0.2454 - regression_loss: 0.2332 - classification_loss: 0.0122 296/500 [================>.............] - ETA: 1:08 - loss: 0.2455 - regression_loss: 0.2333 - classification_loss: 0.0122 297/500 [================>.............] - ETA: 1:08 - loss: 0.2454 - regression_loss: 0.2332 - classification_loss: 0.0122 298/500 [================>.............] - ETA: 1:08 - loss: 0.2459 - regression_loss: 0.2337 - classification_loss: 0.0122 299/500 [================>.............] - ETA: 1:07 - loss: 0.2460 - regression_loss: 0.2338 - classification_loss: 0.0122 300/500 [=================>............] - ETA: 1:07 - loss: 0.2456 - regression_loss: 0.2334 - classification_loss: 0.0122 301/500 [=================>............] - ETA: 1:07 - loss: 0.2457 - regression_loss: 0.2335 - classification_loss: 0.0122 302/500 [=================>............] - ETA: 1:06 - loss: 0.2452 - regression_loss: 0.2330 - classification_loss: 0.0122 303/500 [=================>............] - ETA: 1:06 - loss: 0.2454 - regression_loss: 0.2332 - classification_loss: 0.0122 304/500 [=================>............] - ETA: 1:06 - loss: 0.2460 - regression_loss: 0.2338 - classification_loss: 0.0122 305/500 [=================>............] - ETA: 1:05 - loss: 0.2456 - regression_loss: 0.2334 - classification_loss: 0.0122 306/500 [=================>............] - ETA: 1:05 - loss: 0.2453 - regression_loss: 0.2332 - classification_loss: 0.0122 307/500 [=================>............] - ETA: 1:05 - loss: 0.2451 - regression_loss: 0.2330 - classification_loss: 0.0121 308/500 [=================>............] 
- ETA: 1:04 - loss: 0.2448 - regression_loss: 0.2327 - classification_loss: 0.0121 309/500 [=================>............] - ETA: 1:04 - loss: 0.2445 - regression_loss: 0.2324 - classification_loss: 0.0121 310/500 [=================>............] - ETA: 1:04 - loss: 0.2446 - regression_loss: 0.2325 - classification_loss: 0.0121 311/500 [=================>............] - ETA: 1:03 - loss: 0.2446 - regression_loss: 0.2325 - classification_loss: 0.0121 312/500 [=================>............] - ETA: 1:03 - loss: 0.2443 - regression_loss: 0.2322 - classification_loss: 0.0121 313/500 [=================>............] - ETA: 1:03 - loss: 0.2444 - regression_loss: 0.2323 - classification_loss: 0.0121 314/500 [=================>............] - ETA: 1:02 - loss: 0.2444 - regression_loss: 0.2323 - classification_loss: 0.0121 315/500 [=================>............] - ETA: 1:02 - loss: 0.2442 - regression_loss: 0.2321 - classification_loss: 0.0121 316/500 [=================>............] - ETA: 1:02 - loss: 0.2446 - regression_loss: 0.2324 - classification_loss: 0.0122 317/500 [==================>...........] - ETA: 1:01 - loss: 0.2445 - regression_loss: 0.2323 - classification_loss: 0.0121 318/500 [==================>...........] - ETA: 1:01 - loss: 0.2443 - regression_loss: 0.2322 - classification_loss: 0.0121 319/500 [==================>...........] - ETA: 1:01 - loss: 0.2446 - regression_loss: 0.2324 - classification_loss: 0.0122 320/500 [==================>...........] - ETA: 1:00 - loss: 0.2441 - regression_loss: 0.2319 - classification_loss: 0.0122 321/500 [==================>...........] - ETA: 1:00 - loss: 0.2442 - regression_loss: 0.2320 - classification_loss: 0.0122 322/500 [==================>...........] - ETA: 1:00 - loss: 0.2440 - regression_loss: 0.2319 - classification_loss: 0.0121 323/500 [==================>...........] - ETA: 59s - loss: 0.2440 - regression_loss: 0.2319 - classification_loss: 0.0121  324/500 [==================>...........] 
- ETA: 59s - loss: 0.2442 - regression_loss: 0.2321 - classification_loss: 0.0121 325/500 [==================>...........] - ETA: 58s - loss: 0.2438 - regression_loss: 0.2317 - classification_loss: 0.0121 326/500 [==================>...........] - ETA: 58s - loss: 0.2436 - regression_loss: 0.2315 - classification_loss: 0.0121 327/500 [==================>...........] - ETA: 58s - loss: 0.2436 - regression_loss: 0.2315 - classification_loss: 0.0121 328/500 [==================>...........] - ETA: 57s - loss: 0.2440 - regression_loss: 0.2320 - classification_loss: 0.0121 329/500 [==================>...........] - ETA: 57s - loss: 0.2436 - regression_loss: 0.2316 - classification_loss: 0.0121 330/500 [==================>...........] - ETA: 57s - loss: 0.2442 - regression_loss: 0.2321 - classification_loss: 0.0121 331/500 [==================>...........] - ETA: 56s - loss: 0.2446 - regression_loss: 0.2325 - classification_loss: 0.0120 332/500 [==================>...........] - ETA: 56s - loss: 0.2445 - regression_loss: 0.2325 - classification_loss: 0.0120 333/500 [==================>...........] - ETA: 56s - loss: 0.2452 - regression_loss: 0.2331 - classification_loss: 0.0122 334/500 [===================>..........] - ETA: 55s - loss: 0.2451 - regression_loss: 0.2330 - classification_loss: 0.0121 335/500 [===================>..........] - ETA: 55s - loss: 0.2447 - regression_loss: 0.2326 - classification_loss: 0.0121 336/500 [===================>..........] - ETA: 55s - loss: 0.2451 - regression_loss: 0.2330 - classification_loss: 0.0121 337/500 [===================>..........] - ETA: 54s - loss: 0.2456 - regression_loss: 0.2335 - classification_loss: 0.0121 338/500 [===================>..........] - ETA: 54s - loss: 0.2459 - regression_loss: 0.2338 - classification_loss: 0.0122 339/500 [===================>..........] - ETA: 54s - loss: 0.2460 - regression_loss: 0.2338 - classification_loss: 0.0122 340/500 [===================>..........] 
- ETA: 53s - loss: 0.2458 - regression_loss: 0.2336 - classification_loss: 0.0122 341/500 [===================>..........] - ETA: 53s - loss: 0.2453 - regression_loss: 0.2332 - classification_loss: 0.0121 342/500 [===================>..........] - ETA: 53s - loss: 0.2451 - regression_loss: 0.2330 - classification_loss: 0.0121 343/500 [===================>..........] - ETA: 52s - loss: 0.2450 - regression_loss: 0.2329 - classification_loss: 0.0121 344/500 [===================>..........] - ETA: 52s - loss: 0.2447 - regression_loss: 0.2326 - classification_loss: 0.0121 345/500 [===================>..........] - ETA: 52s - loss: 0.2448 - regression_loss: 0.2327 - classification_loss: 0.0121 346/500 [===================>..........] - ETA: 51s - loss: 0.2451 - regression_loss: 0.2330 - classification_loss: 0.0121 347/500 [===================>..........] - ETA: 51s - loss: 0.2451 - regression_loss: 0.2330 - classification_loss: 0.0121 348/500 [===================>..........] - ETA: 51s - loss: 0.2450 - regression_loss: 0.2328 - classification_loss: 0.0121 349/500 [===================>..........] - ETA: 50s - loss: 0.2445 - regression_loss: 0.2324 - classification_loss: 0.0121 350/500 [====================>.........] - ETA: 50s - loss: 0.2441 - regression_loss: 0.2320 - classification_loss: 0.0121 351/500 [====================>.........] - ETA: 50s - loss: 0.2438 - regression_loss: 0.2317 - classification_loss: 0.0120 352/500 [====================>.........] - ETA: 49s - loss: 0.2435 - regression_loss: 0.2315 - classification_loss: 0.0120 353/500 [====================>.........] - ETA: 49s - loss: 0.2438 - regression_loss: 0.2318 - classification_loss: 0.0121 354/500 [====================>.........] - ETA: 49s - loss: 0.2445 - regression_loss: 0.2323 - classification_loss: 0.0121 355/500 [====================>.........] - ETA: 48s - loss: 0.2440 - regression_loss: 0.2318 - classification_loss: 0.0121 356/500 [====================>.........] 
- ETA: 48s - loss: 0.2439 - regression_loss: 0.2318 - classification_loss: 0.0121 357/500 [====================>.........] - ETA: 47s - loss: 0.2437 - regression_loss: 0.2316 - classification_loss: 0.0121 358/500 [====================>.........] - ETA: 47s - loss: 0.2436 - regression_loss: 0.2315 - classification_loss: 0.0121 359/500 [====================>.........] - ETA: 47s - loss: 0.2438 - regression_loss: 0.2317 - classification_loss: 0.0121 360/500 [====================>.........] - ETA: 46s - loss: 0.2439 - regression_loss: 0.2318 - classification_loss: 0.0121 361/500 [====================>.........] - ETA: 46s - loss: 0.2438 - regression_loss: 0.2317 - classification_loss: 0.0121 362/500 [====================>.........] - ETA: 46s - loss: 0.2438 - regression_loss: 0.2318 - classification_loss: 0.0121 363/500 [====================>.........] - ETA: 45s - loss: 0.2442 - regression_loss: 0.2322 - classification_loss: 0.0121 364/500 [====================>.........] - ETA: 45s - loss: 0.2444 - regression_loss: 0.2323 - classification_loss: 0.0121 365/500 [====================>.........] - ETA: 45s - loss: 0.2443 - regression_loss: 0.2322 - classification_loss: 0.0121 366/500 [====================>.........] - ETA: 44s - loss: 0.2442 - regression_loss: 0.2321 - classification_loss: 0.0121 367/500 [=====================>........] - ETA: 44s - loss: 0.2445 - regression_loss: 0.2324 - classification_loss: 0.0121 368/500 [=====================>........] - ETA: 44s - loss: 0.2449 - regression_loss: 0.2328 - classification_loss: 0.0121 369/500 [=====================>........] - ETA: 43s - loss: 0.2447 - regression_loss: 0.2326 - classification_loss: 0.0121 370/500 [=====================>........] - ETA: 43s - loss: 0.2450 - regression_loss: 0.2329 - classification_loss: 0.0121 371/500 [=====================>........] - ETA: 43s - loss: 0.2453 - regression_loss: 0.2332 - classification_loss: 0.0121 372/500 [=====================>........] 
- ETA: 42s - loss: 0.2453 - regression_loss: 0.2332 - classification_loss: 0.0121 373/500 [=====================>........] - ETA: 42s - loss: 0.2458 - regression_loss: 0.2336 - classification_loss: 0.0122 374/500 [=====================>........] - ETA: 42s - loss: 0.2466 - regression_loss: 0.2343 - classification_loss: 0.0123 375/500 [=====================>........] - ETA: 41s - loss: 0.2465 - regression_loss: 0.2342 - classification_loss: 0.0123 376/500 [=====================>........] - ETA: 41s - loss: 0.2464 - regression_loss: 0.2341 - classification_loss: 0.0123 377/500 [=====================>........] - ETA: 41s - loss: 0.2462 - regression_loss: 0.2339 - classification_loss: 0.0123 378/500 [=====================>........] - ETA: 40s - loss: 0.2462 - regression_loss: 0.2340 - classification_loss: 0.0123 379/500 [=====================>........] - ETA: 40s - loss: 0.2462 - regression_loss: 0.2339 - classification_loss: 0.0123 380/500 [=====================>........] - ETA: 40s - loss: 0.2462 - regression_loss: 0.2340 - classification_loss: 0.0123 381/500 [=====================>........] - ETA: 39s - loss: 0.2464 - regression_loss: 0.2341 - classification_loss: 0.0123 382/500 [=====================>........] - ETA: 39s - loss: 0.2463 - regression_loss: 0.2341 - classification_loss: 0.0122 383/500 [=====================>........] - ETA: 39s - loss: 0.2463 - regression_loss: 0.2341 - classification_loss: 0.0122 384/500 [======================>.......] - ETA: 38s - loss: 0.2460 - regression_loss: 0.2338 - classification_loss: 0.0122 385/500 [======================>.......] - ETA: 38s - loss: 0.2459 - regression_loss: 0.2337 - classification_loss: 0.0122 386/500 [======================>.......] - ETA: 38s - loss: 0.2457 - regression_loss: 0.2336 - classification_loss: 0.0122 387/500 [======================>.......] - ETA: 37s - loss: 0.2461 - regression_loss: 0.2339 - classification_loss: 0.0122 388/500 [======================>.......] 
- ETA: 37s - loss: 0.2461 - regression_loss: 0.2339 - classification_loss: 0.0122 389/500 [======================>.......] - ETA: 37s - loss: 0.2460 - regression_loss: 0.2339 - classification_loss: 0.0122 390/500 [======================>.......] - ETA: 36s - loss: 0.2458 - regression_loss: 0.2336 - classification_loss: 0.0122 391/500 [======================>.......] - ETA: 36s - loss: 0.2456 - regression_loss: 0.2334 - classification_loss: 0.0121 392/500 [======================>.......] - ETA: 35s - loss: 0.2453 - regression_loss: 0.2331 - classification_loss: 0.0121 393/500 [======================>.......] - ETA: 35s - loss: 0.2454 - regression_loss: 0.2333 - classification_loss: 0.0121 394/500 [======================>.......] - ETA: 35s - loss: 0.2454 - regression_loss: 0.2333 - classification_loss: 0.0121 395/500 [======================>.......] - ETA: 34s - loss: 0.2460 - regression_loss: 0.2339 - classification_loss: 0.0121 396/500 [======================>.......] - ETA: 34s - loss: 0.2459 - regression_loss: 0.2338 - classification_loss: 0.0121 397/500 [======================>.......] - ETA: 34s - loss: 0.2459 - regression_loss: 0.2338 - classification_loss: 0.0121 398/500 [======================>.......] - ETA: 33s - loss: 0.2459 - regression_loss: 0.2338 - classification_loss: 0.0121 399/500 [======================>.......] - ETA: 33s - loss: 0.2466 - regression_loss: 0.2345 - classification_loss: 0.0122 400/500 [=======================>......] - ETA: 33s - loss: 0.2466 - regression_loss: 0.2345 - classification_loss: 0.0122 401/500 [=======================>......] - ETA: 32s - loss: 0.2467 - regression_loss: 0.2345 - classification_loss: 0.0122 402/500 [=======================>......] - ETA: 32s - loss: 0.2469 - regression_loss: 0.2348 - classification_loss: 0.0122 403/500 [=======================>......] - ETA: 32s - loss: 0.2469 - regression_loss: 0.2348 - classification_loss: 0.0122 404/500 [=======================>......] 
- ETA: 31s - loss: 0.2468 - regression_loss: 0.2347 - classification_loss: 0.0122 405/500 [=======================>......] - ETA: 31s - loss: 0.2464 - regression_loss: 0.2343 - classification_loss: 0.0121 406/500 [=======================>......] - ETA: 31s - loss: 0.2462 - regression_loss: 0.2341 - classification_loss: 0.0121 407/500 [=======================>......] - ETA: 30s - loss: 0.2472 - regression_loss: 0.2349 - classification_loss: 0.0123 408/500 [=======================>......] - ETA: 30s - loss: 0.2474 - regression_loss: 0.2351 - classification_loss: 0.0123 409/500 [=======================>......] - ETA: 30s - loss: 0.2473 - regression_loss: 0.2350 - classification_loss: 0.0123 410/500 [=======================>......] - ETA: 29s - loss: 0.2476 - regression_loss: 0.2352 - classification_loss: 0.0125 411/500 [=======================>......] - ETA: 29s - loss: 0.2474 - regression_loss: 0.2349 - classification_loss: 0.0125 412/500 [=======================>......] - ETA: 29s - loss: 0.2471 - regression_loss: 0.2347 - classification_loss: 0.0124 413/500 [=======================>......] - ETA: 28s - loss: 0.2470 - regression_loss: 0.2346 - classification_loss: 0.0124 414/500 [=======================>......] - ETA: 28s - loss: 0.2467 - regression_loss: 0.2343 - classification_loss: 0.0124 415/500 [=======================>......] - ETA: 28s - loss: 0.2467 - regression_loss: 0.2343 - classification_loss: 0.0124 416/500 [=======================>......] - ETA: 27s - loss: 0.2466 - regression_loss: 0.2342 - classification_loss: 0.0124 417/500 [========================>.....] - ETA: 27s - loss: 0.2465 - regression_loss: 0.2341 - classification_loss: 0.0124 418/500 [========================>.....] - ETA: 27s - loss: 0.2463 - regression_loss: 0.2339 - classification_loss: 0.0124 419/500 [========================>.....] - ETA: 26s - loss: 0.2464 - regression_loss: 0.2339 - classification_loss: 0.0124 420/500 [========================>.....] 
- ETA: 26s - loss: 0.2462 - regression_loss: 0.2338 - classification_loss: 0.0124 421/500 [========================>.....] - ETA: 26s - loss: 0.2460 - regression_loss: 0.2336 - classification_loss: 0.0124 422/500 [========================>.....] - ETA: 25s - loss: 0.2462 - regression_loss: 0.2338 - classification_loss: 0.0124 423/500 [========================>.....] - ETA: 25s - loss: 0.2462 - regression_loss: 0.2338 - classification_loss: 0.0124 424/500 [========================>.....] - ETA: 25s - loss: 0.2459 - regression_loss: 0.2335 - classification_loss: 0.0124 425/500 [========================>.....] - ETA: 24s - loss: 0.2458 - regression_loss: 0.2334 - classification_loss: 0.0124 426/500 [========================>.....] - ETA: 24s - loss: 0.2456 - regression_loss: 0.2333 - classification_loss: 0.0124 427/500 [========================>.....] - ETA: 24s - loss: 0.2456 - regression_loss: 0.2332 - classification_loss: 0.0124 428/500 [========================>.....] - ETA: 23s - loss: 0.2459 - regression_loss: 0.2335 - classification_loss: 0.0124 429/500 [========================>.....] - ETA: 23s - loss: 0.2456 - regression_loss: 0.2332 - classification_loss: 0.0124 430/500 [========================>.....] - ETA: 23s - loss: 0.2452 - regression_loss: 0.2328 - classification_loss: 0.0124 431/500 [========================>.....] - ETA: 22s - loss: 0.2456 - regression_loss: 0.2332 - classification_loss: 0.0124 432/500 [========================>.....] - ETA: 22s - loss: 0.2460 - regression_loss: 0.2336 - classification_loss: 0.0124 433/500 [========================>.....] - ETA: 22s - loss: 0.2462 - regression_loss: 0.2338 - classification_loss: 0.0124 434/500 [=========================>....] - ETA: 21s - loss: 0.2463 - regression_loss: 0.2339 - classification_loss: 0.0124 435/500 [=========================>....] - ETA: 21s - loss: 0.2462 - regression_loss: 0.2338 - classification_loss: 0.0124 436/500 [=========================>....] 
- ETA: 21s - loss: 0.2458 - regression_loss: 0.2334 - classification_loss: 0.0124 437/500 [=========================>....] - ETA: 20s - loss: 0.2457 - regression_loss: 0.2334 - classification_loss: 0.0123 438/500 [=========================>....] - ETA: 20s - loss: 0.2459 - regression_loss: 0.2335 - classification_loss: 0.0124 439/500 [=========================>....] - ETA: 20s - loss: 0.2456 - regression_loss: 0.2333 - classification_loss: 0.0124 440/500 [=========================>....] - ETA: 19s - loss: 0.2456 - regression_loss: 0.2333 - classification_loss: 0.0124 441/500 [=========================>....] - ETA: 19s - loss: 0.2456 - regression_loss: 0.2333 - classification_loss: 0.0124 442/500 [=========================>....] - ETA: 19s - loss: 0.2453 - regression_loss: 0.2330 - classification_loss: 0.0123 443/500 [=========================>....] - ETA: 18s - loss: 0.2452 - regression_loss: 0.2329 - classification_loss: 0.0123 444/500 [=========================>....] - ETA: 18s - loss: 0.2454 - regression_loss: 0.2330 - classification_loss: 0.0123 445/500 [=========================>....] - ETA: 18s - loss: 0.2451 - regression_loss: 0.2328 - classification_loss: 0.0123 446/500 [=========================>....] - ETA: 17s - loss: 0.2449 - regression_loss: 0.2326 - classification_loss: 0.0123 447/500 [=========================>....] - ETA: 17s - loss: 0.2448 - regression_loss: 0.2326 - classification_loss: 0.0123 448/500 [=========================>....] - ETA: 17s - loss: 0.2449 - regression_loss: 0.2326 - classification_loss: 0.0123 449/500 [=========================>....] - ETA: 16s - loss: 0.2448 - regression_loss: 0.2326 - classification_loss: 0.0123 450/500 [==========================>...] - ETA: 16s - loss: 0.2447 - regression_loss: 0.2324 - classification_loss: 0.0123 451/500 [==========================>...] - ETA: 16s - loss: 0.2443 - regression_loss: 0.2321 - classification_loss: 0.0122 452/500 [==========================>...] 
[Epoch 45/150: per-step progress output for steps 452–499 omitted; running loss held steady around 0.243–0.245]
500/500 [==============================] - 164s 328ms/step - loss: 0.2436 - regression_loss: 0.2316 - classification_loss: 0.0120
1172 instances of class plum with average precision: 0.7521
mAP: 0.7521
Epoch 00045: saving model to ./training/snapshots/resnet101_pascal_45.h5
Epoch 46/150
[Per-step progress output for steps 1–286 omitted; after early fluctuation between roughly 0.16 and 0.31, the running loss settled near 0.244 - regression_loss: ~0.233 - classification_loss: ~0.011, ETA ~1:06 at step 286/500]
- ETA: 1:05 - loss: 0.2446 - regression_loss: 0.2334 - classification_loss: 0.0112 287/500 [================>.............] - ETA: 1:05 - loss: 0.2442 - regression_loss: 0.2330 - classification_loss: 0.0112 288/500 [================>.............] - ETA: 1:05 - loss: 0.2443 - regression_loss: 0.2332 - classification_loss: 0.0112 289/500 [================>.............] - ETA: 1:04 - loss: 0.2440 - regression_loss: 0.2329 - classification_loss: 0.0112 290/500 [================>.............] - ETA: 1:04 - loss: 0.2442 - regression_loss: 0.2330 - classification_loss: 0.0112 291/500 [================>.............] - ETA: 1:04 - loss: 0.2443 - regression_loss: 0.2331 - classification_loss: 0.0112 292/500 [================>.............] - ETA: 1:04 - loss: 0.2442 - regression_loss: 0.2330 - classification_loss: 0.0112 293/500 [================>.............] - ETA: 1:03 - loss: 0.2443 - regression_loss: 0.2331 - classification_loss: 0.0112 294/500 [================>.............] - ETA: 1:03 - loss: 0.2446 - regression_loss: 0.2333 - classification_loss: 0.0112 295/500 [================>.............] - ETA: 1:03 - loss: 0.2454 - regression_loss: 0.2342 - classification_loss: 0.0112 296/500 [================>.............] - ETA: 1:02 - loss: 0.2455 - regression_loss: 0.2342 - classification_loss: 0.0112 297/500 [================>.............] - ETA: 1:02 - loss: 0.2453 - regression_loss: 0.2341 - classification_loss: 0.0112 298/500 [================>.............] - ETA: 1:02 - loss: 0.2451 - regression_loss: 0.2339 - classification_loss: 0.0112 299/500 [================>.............] - ETA: 1:01 - loss: 0.2453 - regression_loss: 0.2341 - classification_loss: 0.0112 300/500 [=================>............] - ETA: 1:01 - loss: 0.2458 - regression_loss: 0.2346 - classification_loss: 0.0112 301/500 [=================>............] - ETA: 1:01 - loss: 0.2463 - regression_loss: 0.2351 - classification_loss: 0.0112 302/500 [=================>............] 
- ETA: 1:00 - loss: 0.2459 - regression_loss: 0.2347 - classification_loss: 0.0112 303/500 [=================>............] - ETA: 1:00 - loss: 0.2458 - regression_loss: 0.2346 - classification_loss: 0.0112 304/500 [=================>............] - ETA: 1:00 - loss: 0.2454 - regression_loss: 0.2343 - classification_loss: 0.0112 305/500 [=================>............] - ETA: 1:00 - loss: 0.2449 - regression_loss: 0.2338 - classification_loss: 0.0111 306/500 [=================>............] - ETA: 59s - loss: 0.2451 - regression_loss: 0.2339 - classification_loss: 0.0111  307/500 [=================>............] - ETA: 59s - loss: 0.2455 - regression_loss: 0.2343 - classification_loss: 0.0111 308/500 [=================>............] - ETA: 59s - loss: 0.2455 - regression_loss: 0.2344 - classification_loss: 0.0111 309/500 [=================>............] - ETA: 58s - loss: 0.2454 - regression_loss: 0.2343 - classification_loss: 0.0111 310/500 [=================>............] - ETA: 58s - loss: 0.2453 - regression_loss: 0.2341 - classification_loss: 0.0111 311/500 [=================>............] - ETA: 58s - loss: 0.2452 - regression_loss: 0.2341 - classification_loss: 0.0111 312/500 [=================>............] - ETA: 57s - loss: 0.2452 - regression_loss: 0.2340 - classification_loss: 0.0111 313/500 [=================>............] - ETA: 57s - loss: 0.2449 - regression_loss: 0.2338 - classification_loss: 0.0111 314/500 [=================>............] - ETA: 57s - loss: 0.2451 - regression_loss: 0.2340 - classification_loss: 0.0111 315/500 [=================>............] - ETA: 56s - loss: 0.2451 - regression_loss: 0.2340 - classification_loss: 0.0111 316/500 [=================>............] - ETA: 56s - loss: 0.2449 - regression_loss: 0.2338 - classification_loss: 0.0111 317/500 [==================>...........] - ETA: 56s - loss: 0.2448 - regression_loss: 0.2337 - classification_loss: 0.0111 318/500 [==================>...........] 
- ETA: 56s - loss: 0.2451 - regression_loss: 0.2339 - classification_loss: 0.0113 319/500 [==================>...........] - ETA: 55s - loss: 0.2446 - regression_loss: 0.2334 - classification_loss: 0.0112 320/500 [==================>...........] - ETA: 55s - loss: 0.2443 - regression_loss: 0.2331 - classification_loss: 0.0112 321/500 [==================>...........] - ETA: 55s - loss: 0.2444 - regression_loss: 0.2332 - classification_loss: 0.0112 322/500 [==================>...........] - ETA: 54s - loss: 0.2446 - regression_loss: 0.2333 - classification_loss: 0.0112 323/500 [==================>...........] - ETA: 54s - loss: 0.2442 - regression_loss: 0.2330 - classification_loss: 0.0112 324/500 [==================>...........] - ETA: 54s - loss: 0.2440 - regression_loss: 0.2328 - classification_loss: 0.0112 325/500 [==================>...........] - ETA: 53s - loss: 0.2440 - regression_loss: 0.2328 - classification_loss: 0.0112 326/500 [==================>...........] - ETA: 53s - loss: 0.2437 - regression_loss: 0.2325 - classification_loss: 0.0112 327/500 [==================>...........] - ETA: 53s - loss: 0.2440 - regression_loss: 0.2329 - classification_loss: 0.0112 328/500 [==================>...........] - ETA: 52s - loss: 0.2442 - regression_loss: 0.2330 - classification_loss: 0.0112 329/500 [==================>...........] - ETA: 52s - loss: 0.2447 - regression_loss: 0.2335 - classification_loss: 0.0113 330/500 [==================>...........] - ETA: 52s - loss: 0.2446 - regression_loss: 0.2333 - classification_loss: 0.0112 331/500 [==================>...........] - ETA: 52s - loss: 0.2448 - regression_loss: 0.2335 - classification_loss: 0.0112 332/500 [==================>...........] - ETA: 51s - loss: 0.2451 - regression_loss: 0.2338 - classification_loss: 0.0113 333/500 [==================>...........] - ETA: 51s - loss: 0.2448 - regression_loss: 0.2336 - classification_loss: 0.0112 334/500 [===================>..........] 
- ETA: 51s - loss: 0.2443 - regression_loss: 0.2331 - classification_loss: 0.0112 335/500 [===================>..........] - ETA: 50s - loss: 0.2440 - regression_loss: 0.2328 - classification_loss: 0.0112 336/500 [===================>..........] - ETA: 50s - loss: 0.2439 - regression_loss: 0.2327 - classification_loss: 0.0112 337/500 [===================>..........] - ETA: 50s - loss: 0.2440 - regression_loss: 0.2328 - classification_loss: 0.0112 338/500 [===================>..........] - ETA: 49s - loss: 0.2438 - regression_loss: 0.2327 - classification_loss: 0.0112 339/500 [===================>..........] - ETA: 49s - loss: 0.2441 - regression_loss: 0.2329 - classification_loss: 0.0112 340/500 [===================>..........] - ETA: 49s - loss: 0.2437 - regression_loss: 0.2325 - classification_loss: 0.0112 341/500 [===================>..........] - ETA: 48s - loss: 0.2437 - regression_loss: 0.2325 - classification_loss: 0.0111 342/500 [===================>..........] - ETA: 48s - loss: 0.2434 - regression_loss: 0.2323 - classification_loss: 0.0111 343/500 [===================>..........] - ETA: 48s - loss: 0.2432 - regression_loss: 0.2320 - classification_loss: 0.0111 344/500 [===================>..........] - ETA: 47s - loss: 0.2428 - regression_loss: 0.2318 - classification_loss: 0.0111 345/500 [===================>..........] - ETA: 47s - loss: 0.2432 - regression_loss: 0.2320 - classification_loss: 0.0111 346/500 [===================>..........] - ETA: 47s - loss: 0.2429 - regression_loss: 0.2318 - classification_loss: 0.0111 347/500 [===================>..........] - ETA: 47s - loss: 0.2427 - regression_loss: 0.2316 - classification_loss: 0.0111 348/500 [===================>..........] - ETA: 46s - loss: 0.2427 - regression_loss: 0.2316 - classification_loss: 0.0111 349/500 [===================>..........] - ETA: 46s - loss: 0.2424 - regression_loss: 0.2313 - classification_loss: 0.0111 350/500 [====================>.........] 
- ETA: 46s - loss: 0.2428 - regression_loss: 0.2317 - classification_loss: 0.0111 351/500 [====================>.........] - ETA: 45s - loss: 0.2427 - regression_loss: 0.2315 - classification_loss: 0.0111 352/500 [====================>.........] - ETA: 45s - loss: 0.2432 - regression_loss: 0.2321 - classification_loss: 0.0111 353/500 [====================>.........] - ETA: 45s - loss: 0.2429 - regression_loss: 0.2317 - classification_loss: 0.0111 354/500 [====================>.........] - ETA: 44s - loss: 0.2433 - regression_loss: 0.2321 - classification_loss: 0.0111 355/500 [====================>.........] - ETA: 44s - loss: 0.2435 - regression_loss: 0.2324 - classification_loss: 0.0111 356/500 [====================>.........] - ETA: 44s - loss: 0.2433 - regression_loss: 0.2322 - classification_loss: 0.0111 357/500 [====================>.........] - ETA: 43s - loss: 0.2431 - regression_loss: 0.2320 - classification_loss: 0.0111 358/500 [====================>.........] - ETA: 43s - loss: 0.2429 - regression_loss: 0.2319 - classification_loss: 0.0111 359/500 [====================>.........] - ETA: 43s - loss: 0.2427 - regression_loss: 0.2316 - classification_loss: 0.0110 360/500 [====================>.........] - ETA: 43s - loss: 0.2423 - regression_loss: 0.2313 - classification_loss: 0.0110 361/500 [====================>.........] - ETA: 42s - loss: 0.2421 - regression_loss: 0.2311 - classification_loss: 0.0110 362/500 [====================>.........] - ETA: 42s - loss: 0.2421 - regression_loss: 0.2311 - classification_loss: 0.0110 363/500 [====================>.........] - ETA: 42s - loss: 0.2420 - regression_loss: 0.2311 - classification_loss: 0.0110 364/500 [====================>.........] - ETA: 41s - loss: 0.2419 - regression_loss: 0.2309 - classification_loss: 0.0110 365/500 [====================>.........] - ETA: 41s - loss: 0.2418 - regression_loss: 0.2308 - classification_loss: 0.0110 366/500 [====================>.........] 
- ETA: 41s - loss: 0.2418 - regression_loss: 0.2308 - classification_loss: 0.0110 367/500 [=====================>........] - ETA: 40s - loss: 0.2419 - regression_loss: 0.2309 - classification_loss: 0.0110 368/500 [=====================>........] - ETA: 40s - loss: 0.2417 - regression_loss: 0.2307 - classification_loss: 0.0110 369/500 [=====================>........] - ETA: 40s - loss: 0.2419 - regression_loss: 0.2309 - classification_loss: 0.0110 370/500 [=====================>........] - ETA: 39s - loss: 0.2417 - regression_loss: 0.2307 - classification_loss: 0.0110 371/500 [=====================>........] - ETA: 39s - loss: 0.2422 - regression_loss: 0.2312 - classification_loss: 0.0110 372/500 [=====================>........] - ETA: 39s - loss: 0.2419 - regression_loss: 0.2309 - classification_loss: 0.0110 373/500 [=====================>........] - ETA: 39s - loss: 0.2420 - regression_loss: 0.2310 - classification_loss: 0.0110 374/500 [=====================>........] - ETA: 38s - loss: 0.2416 - regression_loss: 0.2306 - classification_loss: 0.0110 375/500 [=====================>........] - ETA: 38s - loss: 0.2415 - regression_loss: 0.2305 - classification_loss: 0.0110 376/500 [=====================>........] - ETA: 38s - loss: 0.2421 - regression_loss: 0.2311 - classification_loss: 0.0110 377/500 [=====================>........] - ETA: 37s - loss: 0.2421 - regression_loss: 0.2310 - classification_loss: 0.0110 378/500 [=====================>........] - ETA: 37s - loss: 0.2427 - regression_loss: 0.2317 - classification_loss: 0.0111 379/500 [=====================>........] - ETA: 37s - loss: 0.2435 - regression_loss: 0.2324 - classification_loss: 0.0111 380/500 [=====================>........] - ETA: 36s - loss: 0.2435 - regression_loss: 0.2324 - classification_loss: 0.0111 381/500 [=====================>........] - ETA: 36s - loss: 0.2433 - regression_loss: 0.2323 - classification_loss: 0.0110 382/500 [=====================>........] 
- ETA: 36s - loss: 0.2432 - regression_loss: 0.2321 - classification_loss: 0.0110 383/500 [=====================>........] - ETA: 35s - loss: 0.2429 - regression_loss: 0.2319 - classification_loss: 0.0110 384/500 [======================>.......] - ETA: 35s - loss: 0.2428 - regression_loss: 0.2318 - classification_loss: 0.0110 385/500 [======================>.......] - ETA: 35s - loss: 0.2427 - regression_loss: 0.2317 - classification_loss: 0.0110 386/500 [======================>.......] - ETA: 35s - loss: 0.2424 - regression_loss: 0.2315 - classification_loss: 0.0109 387/500 [======================>.......] - ETA: 34s - loss: 0.2422 - regression_loss: 0.2313 - classification_loss: 0.0109 388/500 [======================>.......] - ETA: 34s - loss: 0.2425 - regression_loss: 0.2315 - classification_loss: 0.0110 389/500 [======================>.......] - ETA: 34s - loss: 0.2428 - regression_loss: 0.2318 - classification_loss: 0.0110 390/500 [======================>.......] - ETA: 33s - loss: 0.2436 - regression_loss: 0.2324 - classification_loss: 0.0112 391/500 [======================>.......] - ETA: 33s - loss: 0.2433 - regression_loss: 0.2321 - classification_loss: 0.0112 392/500 [======================>.......] - ETA: 33s - loss: 0.2435 - regression_loss: 0.2323 - classification_loss: 0.0112 393/500 [======================>.......] - ETA: 32s - loss: 0.2434 - regression_loss: 0.2322 - classification_loss: 0.0112 394/500 [======================>.......] - ETA: 32s - loss: 0.2432 - regression_loss: 0.2321 - classification_loss: 0.0112 395/500 [======================>.......] - ETA: 32s - loss: 0.2431 - regression_loss: 0.2320 - classification_loss: 0.0112 396/500 [======================>.......] - ETA: 31s - loss: 0.2431 - regression_loss: 0.2319 - classification_loss: 0.0112 397/500 [======================>.......] - ETA: 31s - loss: 0.2431 - regression_loss: 0.2319 - classification_loss: 0.0112 398/500 [======================>.......] 
- ETA: 31s - loss: 0.2431 - regression_loss: 0.2319 - classification_loss: 0.0112 399/500 [======================>.......] - ETA: 31s - loss: 0.2427 - regression_loss: 0.2316 - classification_loss: 0.0111 400/500 [=======================>......] - ETA: 30s - loss: 0.2427 - regression_loss: 0.2316 - classification_loss: 0.0111 401/500 [=======================>......] - ETA: 30s - loss: 0.2424 - regression_loss: 0.2313 - classification_loss: 0.0111 402/500 [=======================>......] - ETA: 30s - loss: 0.2426 - regression_loss: 0.2315 - classification_loss: 0.0111 403/500 [=======================>......] - ETA: 29s - loss: 0.2422 - regression_loss: 0.2311 - classification_loss: 0.0111 404/500 [=======================>......] - ETA: 29s - loss: 0.2419 - regression_loss: 0.2308 - classification_loss: 0.0111 405/500 [=======================>......] - ETA: 29s - loss: 0.2418 - regression_loss: 0.2307 - classification_loss: 0.0111 406/500 [=======================>......] - ETA: 28s - loss: 0.2422 - regression_loss: 0.2311 - classification_loss: 0.0111 407/500 [=======================>......] - ETA: 28s - loss: 0.2424 - regression_loss: 0.2313 - classification_loss: 0.0111 408/500 [=======================>......] - ETA: 28s - loss: 0.2428 - regression_loss: 0.2316 - classification_loss: 0.0111 409/500 [=======================>......] - ETA: 27s - loss: 0.2427 - regression_loss: 0.2316 - classification_loss: 0.0111 410/500 [=======================>......] - ETA: 27s - loss: 0.2429 - regression_loss: 0.2318 - classification_loss: 0.0111 411/500 [=======================>......] - ETA: 27s - loss: 0.2428 - regression_loss: 0.2317 - classification_loss: 0.0111 412/500 [=======================>......] - ETA: 27s - loss: 0.2429 - regression_loss: 0.2318 - classification_loss: 0.0111 413/500 [=======================>......] - ETA: 26s - loss: 0.2431 - regression_loss: 0.2320 - classification_loss: 0.0111 414/500 [=======================>......] 
- ETA: 26s - loss: 0.2431 - regression_loss: 0.2320 - classification_loss: 0.0111 415/500 [=======================>......] - ETA: 26s - loss: 0.2431 - regression_loss: 0.2320 - classification_loss: 0.0111 416/500 [=======================>......] - ETA: 25s - loss: 0.2431 - regression_loss: 0.2320 - classification_loss: 0.0111 417/500 [========================>.....] - ETA: 25s - loss: 0.2434 - regression_loss: 0.2323 - classification_loss: 0.0111 418/500 [========================>.....] - ETA: 25s - loss: 0.2436 - regression_loss: 0.2325 - classification_loss: 0.0111 419/500 [========================>.....] - ETA: 24s - loss: 0.2433 - regression_loss: 0.2322 - classification_loss: 0.0111 420/500 [========================>.....] - ETA: 24s - loss: 0.2432 - regression_loss: 0.2321 - classification_loss: 0.0111 421/500 [========================>.....] - ETA: 24s - loss: 0.2435 - regression_loss: 0.2324 - classification_loss: 0.0111 422/500 [========================>.....] - ETA: 23s - loss: 0.2435 - regression_loss: 0.2323 - classification_loss: 0.0111 423/500 [========================>.....] - ETA: 23s - loss: 0.2434 - regression_loss: 0.2323 - classification_loss: 0.0111 424/500 [========================>.....] - ETA: 23s - loss: 0.2436 - regression_loss: 0.2325 - classification_loss: 0.0111 425/500 [========================>.....] - ETA: 23s - loss: 0.2437 - regression_loss: 0.2325 - classification_loss: 0.0112 426/500 [========================>.....] - ETA: 22s - loss: 0.2438 - regression_loss: 0.2326 - classification_loss: 0.0112 427/500 [========================>.....] - ETA: 22s - loss: 0.2438 - regression_loss: 0.2326 - classification_loss: 0.0112 428/500 [========================>.....] - ETA: 22s - loss: 0.2439 - regression_loss: 0.2327 - classification_loss: 0.0112 429/500 [========================>.....] - ETA: 21s - loss: 0.2437 - regression_loss: 0.2325 - classification_loss: 0.0112 430/500 [========================>.....] 
- ETA: 21s - loss: 0.2437 - regression_loss: 0.2326 - classification_loss: 0.0112 431/500 [========================>.....] - ETA: 21s - loss: 0.2435 - regression_loss: 0.2323 - classification_loss: 0.0111 432/500 [========================>.....] - ETA: 20s - loss: 0.2434 - regression_loss: 0.2322 - classification_loss: 0.0111 433/500 [========================>.....] - ETA: 20s - loss: 0.2431 - regression_loss: 0.2320 - classification_loss: 0.0111 434/500 [=========================>....] - ETA: 20s - loss: 0.2431 - regression_loss: 0.2320 - classification_loss: 0.0111 435/500 [=========================>....] - ETA: 19s - loss: 0.2429 - regression_loss: 0.2318 - classification_loss: 0.0111 436/500 [=========================>....] - ETA: 19s - loss: 0.2430 - regression_loss: 0.2319 - classification_loss: 0.0111 437/500 [=========================>....] - ETA: 19s - loss: 0.2434 - regression_loss: 0.2323 - classification_loss: 0.0111 438/500 [=========================>....] - ETA: 19s - loss: 0.2432 - regression_loss: 0.2321 - classification_loss: 0.0111 439/500 [=========================>....] - ETA: 18s - loss: 0.2436 - regression_loss: 0.2324 - classification_loss: 0.0111 440/500 [=========================>....] - ETA: 18s - loss: 0.2432 - regression_loss: 0.2321 - classification_loss: 0.0111 441/500 [=========================>....] - ETA: 18s - loss: 0.2429 - regression_loss: 0.2318 - classification_loss: 0.0111 442/500 [=========================>....] - ETA: 17s - loss: 0.2434 - regression_loss: 0.2323 - classification_loss: 0.0111 443/500 [=========================>....] - ETA: 17s - loss: 0.2432 - regression_loss: 0.2321 - classification_loss: 0.0111 444/500 [=========================>....] - ETA: 17s - loss: 0.2434 - regression_loss: 0.2323 - classification_loss: 0.0111 445/500 [=========================>....] - ETA: 16s - loss: 0.2435 - regression_loss: 0.2324 - classification_loss: 0.0111 446/500 [=========================>....] 
- ETA: 16s - loss: 0.2433 - regression_loss: 0.2323 - classification_loss: 0.0111 447/500 [=========================>....] - ETA: 16s - loss: 0.2433 - regression_loss: 0.2323 - classification_loss: 0.0110 448/500 [=========================>....] - ETA: 15s - loss: 0.2433 - regression_loss: 0.2323 - classification_loss: 0.0110 449/500 [=========================>....] - ETA: 15s - loss: 0.2433 - regression_loss: 0.2323 - classification_loss: 0.0110 450/500 [==========================>...] - ETA: 15s - loss: 0.2432 - regression_loss: 0.2322 - classification_loss: 0.0110 451/500 [==========================>...] - ETA: 15s - loss: 0.2431 - regression_loss: 0.2321 - classification_loss: 0.0110 452/500 [==========================>...] - ETA: 14s - loss: 0.2427 - regression_loss: 0.2317 - classification_loss: 0.0110 453/500 [==========================>...] - ETA: 14s - loss: 0.2429 - regression_loss: 0.2319 - classification_loss: 0.0110 454/500 [==========================>...] - ETA: 14s - loss: 0.2429 - regression_loss: 0.2319 - classification_loss: 0.0110 455/500 [==========================>...] - ETA: 13s - loss: 0.2432 - regression_loss: 0.2322 - classification_loss: 0.0110 456/500 [==========================>...] - ETA: 13s - loss: 0.2432 - regression_loss: 0.2322 - classification_loss: 0.0111 457/500 [==========================>...] - ETA: 13s - loss: 0.2430 - regression_loss: 0.2320 - classification_loss: 0.0110 458/500 [==========================>...] - ETA: 12s - loss: 0.2430 - regression_loss: 0.2319 - classification_loss: 0.0110 459/500 [==========================>...] - ETA: 12s - loss: 0.2429 - regression_loss: 0.2319 - classification_loss: 0.0110 460/500 [==========================>...] - ETA: 12s - loss: 0.2427 - regression_loss: 0.2317 - classification_loss: 0.0110 461/500 [==========================>...] - ETA: 11s - loss: 0.2426 - regression_loss: 0.2316 - classification_loss: 0.0110 462/500 [==========================>...] 
- ETA: 11s - loss: 0.2431 - regression_loss: 0.2321 - classification_loss: 0.0111 463/500 [==========================>...] - ETA: 11s - loss: 0.2432 - regression_loss: 0.2321 - classification_loss: 0.0111 464/500 [==========================>...] - ETA: 11s - loss: 0.2431 - regression_loss: 0.2320 - classification_loss: 0.0111 465/500 [==========================>...] - ETA: 10s - loss: 0.2429 - regression_loss: 0.2318 - classification_loss: 0.0110 466/500 [==========================>...] - ETA: 10s - loss: 0.2426 - regression_loss: 0.2316 - classification_loss: 0.0110 467/500 [===========================>..] - ETA: 10s - loss: 0.2424 - regression_loss: 0.2314 - classification_loss: 0.0110 468/500 [===========================>..] - ETA: 9s - loss: 0.2427 - regression_loss: 0.2317 - classification_loss: 0.0110  469/500 [===========================>..] - ETA: 9s - loss: 0.2427 - regression_loss: 0.2317 - classification_loss: 0.0110 470/500 [===========================>..] - ETA: 9s - loss: 0.2427 - regression_loss: 0.2317 - classification_loss: 0.0110 471/500 [===========================>..] - ETA: 8s - loss: 0.2424 - regression_loss: 0.2314 - classification_loss: 0.0110 472/500 [===========================>..] - ETA: 8s - loss: 0.2423 - regression_loss: 0.2313 - classification_loss: 0.0110 473/500 [===========================>..] - ETA: 8s - loss: 0.2424 - regression_loss: 0.2314 - classification_loss: 0.0110 474/500 [===========================>..] - ETA: 7s - loss: 0.2423 - regression_loss: 0.2313 - classification_loss: 0.0110 475/500 [===========================>..] - ETA: 7s - loss: 0.2423 - regression_loss: 0.2313 - classification_loss: 0.0110 476/500 [===========================>..] - ETA: 7s - loss: 0.2420 - regression_loss: 0.2311 - classification_loss: 0.0110 477/500 [===========================>..] - ETA: 7s - loss: 0.2419 - regression_loss: 0.2310 - classification_loss: 0.0110 478/500 [===========================>..] 
- ETA: 6s - loss: 0.2419 - regression_loss: 0.2309 - classification_loss: 0.0110 479/500 [===========================>..] - ETA: 6s - loss: 0.2415 - regression_loss: 0.2306 - classification_loss: 0.0109 480/500 [===========================>..] - ETA: 6s - loss: 0.2416 - regression_loss: 0.2307 - classification_loss: 0.0109 481/500 [===========================>..] - ETA: 5s - loss: 0.2416 - regression_loss: 0.2306 - classification_loss: 0.0109 482/500 [===========================>..] - ETA: 5s - loss: 0.2414 - regression_loss: 0.2305 - classification_loss: 0.0109 483/500 [===========================>..] - ETA: 5s - loss: 0.2412 - regression_loss: 0.2303 - classification_loss: 0.0109 484/500 [============================>.] - ETA: 4s - loss: 0.2409 - regression_loss: 0.2300 - classification_loss: 0.0109 485/500 [============================>.] - ETA: 4s - loss: 0.2408 - regression_loss: 0.2299 - classification_loss: 0.0109 486/500 [============================>.] - ETA: 4s - loss: 0.2410 - regression_loss: 0.2301 - classification_loss: 0.0109 487/500 [============================>.] - ETA: 3s - loss: 0.2411 - regression_loss: 0.2302 - classification_loss: 0.0109 488/500 [============================>.] - ETA: 3s - loss: 0.2412 - regression_loss: 0.2303 - classification_loss: 0.0109 489/500 [============================>.] - ETA: 3s - loss: 0.2414 - regression_loss: 0.2305 - classification_loss: 0.0109 490/500 [============================>.] - ETA: 3s - loss: 0.2414 - regression_loss: 0.2305 - classification_loss: 0.0109 491/500 [============================>.] - ETA: 2s - loss: 0.2420 - regression_loss: 0.2311 - classification_loss: 0.0110 492/500 [============================>.] - ETA: 2s - loss: 0.2421 - regression_loss: 0.2311 - classification_loss: 0.0110 493/500 [============================>.] - ETA: 2s - loss: 0.2420 - regression_loss: 0.2310 - classification_loss: 0.0109 494/500 [============================>.] 
500/500 [==============================] - 154s 308ms/step - loss: 0.2418 - regression_loss: 0.2309 - classification_loss: 0.0109
1172 instances of class plum with average precision: 0.7534
mAP: 0.7534
Epoch 00046: saving model to ./training/snapshots/resnet101_pascal_46.h5
Epoch 47/150
[per-batch progress updates omitted; through step 329/500 the running loss stayed near 0.23-0.24 (regression ~0.22-0.23, classification ~0.011), ETA counting down from 2:45]
- ETA: 57s - loss: 0.2431 - regression_loss: 0.2319 - classification_loss: 0.0112 330/500 [==================>...........] - ETA: 56s - loss: 0.2428 - regression_loss: 0.2317 - classification_loss: 0.0111 331/500 [==================>...........] - ETA: 56s - loss: 0.2431 - regression_loss: 0.2319 - classification_loss: 0.0112 332/500 [==================>...........] - ETA: 56s - loss: 0.2429 - regression_loss: 0.2318 - classification_loss: 0.0112 333/500 [==================>...........] - ETA: 55s - loss: 0.2426 - regression_loss: 0.2315 - classification_loss: 0.0112 334/500 [===================>..........] - ETA: 55s - loss: 0.2426 - regression_loss: 0.2315 - classification_loss: 0.0111 335/500 [===================>..........] - ETA: 55s - loss: 0.2425 - regression_loss: 0.2314 - classification_loss: 0.0111 336/500 [===================>..........] - ETA: 54s - loss: 0.2429 - regression_loss: 0.2317 - classification_loss: 0.0112 337/500 [===================>..........] - ETA: 54s - loss: 0.2425 - regression_loss: 0.2314 - classification_loss: 0.0111 338/500 [===================>..........] - ETA: 54s - loss: 0.2424 - regression_loss: 0.2312 - classification_loss: 0.0111 339/500 [===================>..........] - ETA: 53s - loss: 0.2425 - regression_loss: 0.2314 - classification_loss: 0.0111 340/500 [===================>..........] - ETA: 53s - loss: 0.2423 - regression_loss: 0.2312 - classification_loss: 0.0111 341/500 [===================>..........] - ETA: 53s - loss: 0.2423 - regression_loss: 0.2312 - classification_loss: 0.0111 342/500 [===================>..........] - ETA: 52s - loss: 0.2425 - regression_loss: 0.2314 - classification_loss: 0.0111 343/500 [===================>..........] - ETA: 52s - loss: 0.2421 - regression_loss: 0.2310 - classification_loss: 0.0111 344/500 [===================>..........] - ETA: 52s - loss: 0.2419 - regression_loss: 0.2308 - classification_loss: 0.0111 345/500 [===================>..........] 
- ETA: 51s - loss: 0.2415 - regression_loss: 0.2305 - classification_loss: 0.0110 346/500 [===================>..........] - ETA: 51s - loss: 0.2417 - regression_loss: 0.2306 - classification_loss: 0.0111 347/500 [===================>..........] - ETA: 51s - loss: 0.2416 - regression_loss: 0.2305 - classification_loss: 0.0110 348/500 [===================>..........] - ETA: 50s - loss: 0.2416 - regression_loss: 0.2305 - classification_loss: 0.0110 349/500 [===================>..........] - ETA: 50s - loss: 0.2417 - regression_loss: 0.2307 - classification_loss: 0.0111 350/500 [====================>.........] - ETA: 50s - loss: 0.2415 - regression_loss: 0.2305 - classification_loss: 0.0110 351/500 [====================>.........] - ETA: 49s - loss: 0.2418 - regression_loss: 0.2308 - classification_loss: 0.0110 352/500 [====================>.........] - ETA: 49s - loss: 0.2422 - regression_loss: 0.2310 - classification_loss: 0.0111 353/500 [====================>.........] - ETA: 49s - loss: 0.2419 - regression_loss: 0.2308 - classification_loss: 0.0111 354/500 [====================>.........] - ETA: 48s - loss: 0.2421 - regression_loss: 0.2309 - classification_loss: 0.0111 355/500 [====================>.........] - ETA: 48s - loss: 0.2419 - regression_loss: 0.2308 - classification_loss: 0.0111 356/500 [====================>.........] - ETA: 48s - loss: 0.2424 - regression_loss: 0.2313 - classification_loss: 0.0111 357/500 [====================>.........] - ETA: 47s - loss: 0.2427 - regression_loss: 0.2315 - classification_loss: 0.0111 358/500 [====================>.........] - ETA: 47s - loss: 0.2431 - regression_loss: 0.2319 - classification_loss: 0.0111 359/500 [====================>.........] - ETA: 47s - loss: 0.2430 - regression_loss: 0.2318 - classification_loss: 0.0111 360/500 [====================>.........] - ETA: 46s - loss: 0.2428 - regression_loss: 0.2317 - classification_loss: 0.0111 361/500 [====================>.........] 
- ETA: 46s - loss: 0.2430 - regression_loss: 0.2318 - classification_loss: 0.0112 362/500 [====================>.........] - ETA: 46s - loss: 0.2429 - regression_loss: 0.2318 - classification_loss: 0.0111 363/500 [====================>.........] - ETA: 45s - loss: 0.2427 - regression_loss: 0.2316 - classification_loss: 0.0111 364/500 [====================>.........] - ETA: 45s - loss: 0.2424 - regression_loss: 0.2313 - classification_loss: 0.0111 365/500 [====================>.........] - ETA: 45s - loss: 0.2422 - regression_loss: 0.2311 - classification_loss: 0.0111 366/500 [====================>.........] - ETA: 44s - loss: 0.2423 - regression_loss: 0.2312 - classification_loss: 0.0111 367/500 [=====================>........] - ETA: 44s - loss: 0.2424 - regression_loss: 0.2313 - classification_loss: 0.0111 368/500 [=====================>........] - ETA: 44s - loss: 0.2427 - regression_loss: 0.2316 - classification_loss: 0.0111 369/500 [=====================>........] - ETA: 43s - loss: 0.2428 - regression_loss: 0.2317 - classification_loss: 0.0111 370/500 [=====================>........] - ETA: 43s - loss: 0.2430 - regression_loss: 0.2319 - classification_loss: 0.0111 371/500 [=====================>........] - ETA: 43s - loss: 0.2428 - regression_loss: 0.2317 - classification_loss: 0.0111 372/500 [=====================>........] - ETA: 42s - loss: 0.2427 - regression_loss: 0.2316 - classification_loss: 0.0111 373/500 [=====================>........] - ETA: 42s - loss: 0.2423 - regression_loss: 0.2312 - classification_loss: 0.0111 374/500 [=====================>........] - ETA: 42s - loss: 0.2420 - regression_loss: 0.2309 - classification_loss: 0.0111 375/500 [=====================>........] - ETA: 41s - loss: 0.2415 - regression_loss: 0.2305 - classification_loss: 0.0110 376/500 [=====================>........] - ETA: 41s - loss: 0.2413 - regression_loss: 0.2303 - classification_loss: 0.0110 377/500 [=====================>........] 
- ETA: 41s - loss: 0.2414 - regression_loss: 0.2303 - classification_loss: 0.0111 378/500 [=====================>........] - ETA: 40s - loss: 0.2413 - regression_loss: 0.2303 - classification_loss: 0.0110 379/500 [=====================>........] - ETA: 40s - loss: 0.2413 - regression_loss: 0.2302 - classification_loss: 0.0110 380/500 [=====================>........] - ETA: 40s - loss: 0.2418 - regression_loss: 0.2307 - classification_loss: 0.0111 381/500 [=====================>........] - ETA: 39s - loss: 0.2417 - regression_loss: 0.2306 - classification_loss: 0.0110 382/500 [=====================>........] - ETA: 39s - loss: 0.2415 - regression_loss: 0.2305 - classification_loss: 0.0110 383/500 [=====================>........] - ETA: 39s - loss: 0.2418 - regression_loss: 0.2307 - classification_loss: 0.0110 384/500 [======================>.......] - ETA: 38s - loss: 0.2427 - regression_loss: 0.2316 - classification_loss: 0.0111 385/500 [======================>.......] - ETA: 38s - loss: 0.2431 - regression_loss: 0.2320 - classification_loss: 0.0111 386/500 [======================>.......] - ETA: 38s - loss: 0.2431 - regression_loss: 0.2321 - classification_loss: 0.0111 387/500 [======================>.......] - ETA: 37s - loss: 0.2432 - regression_loss: 0.2322 - classification_loss: 0.0111 388/500 [======================>.......] - ETA: 37s - loss: 0.2435 - regression_loss: 0.2324 - classification_loss: 0.0111 389/500 [======================>.......] - ETA: 37s - loss: 0.2434 - regression_loss: 0.2323 - classification_loss: 0.0111 390/500 [======================>.......] - ETA: 36s - loss: 0.2432 - regression_loss: 0.2321 - classification_loss: 0.0111 391/500 [======================>.......] - ETA: 36s - loss: 0.2430 - regression_loss: 0.2319 - classification_loss: 0.0111 392/500 [======================>.......] - ETA: 36s - loss: 0.2429 - regression_loss: 0.2318 - classification_loss: 0.0111 393/500 [======================>.......] 
- ETA: 35s - loss: 0.2428 - regression_loss: 0.2317 - classification_loss: 0.0111 394/500 [======================>.......] - ETA: 35s - loss: 0.2431 - regression_loss: 0.2320 - classification_loss: 0.0111 395/500 [======================>.......] - ETA: 35s - loss: 0.2433 - regression_loss: 0.2322 - classification_loss: 0.0111 396/500 [======================>.......] - ETA: 34s - loss: 0.2429 - regression_loss: 0.2318 - classification_loss: 0.0111 397/500 [======================>.......] - ETA: 34s - loss: 0.2428 - regression_loss: 0.2317 - classification_loss: 0.0110 398/500 [======================>.......] - ETA: 34s - loss: 0.2428 - regression_loss: 0.2318 - classification_loss: 0.0110 399/500 [======================>.......] - ETA: 33s - loss: 0.2427 - regression_loss: 0.2316 - classification_loss: 0.0110 400/500 [=======================>......] - ETA: 33s - loss: 0.2428 - regression_loss: 0.2317 - classification_loss: 0.0110 401/500 [=======================>......] - ETA: 33s - loss: 0.2428 - regression_loss: 0.2318 - classification_loss: 0.0110 402/500 [=======================>......] - ETA: 32s - loss: 0.2427 - regression_loss: 0.2317 - classification_loss: 0.0110 403/500 [=======================>......] - ETA: 32s - loss: 0.2424 - regression_loss: 0.2314 - classification_loss: 0.0110 404/500 [=======================>......] - ETA: 32s - loss: 0.2422 - regression_loss: 0.2312 - classification_loss: 0.0110 405/500 [=======================>......] - ETA: 31s - loss: 0.2422 - regression_loss: 0.2312 - classification_loss: 0.0110 406/500 [=======================>......] - ETA: 31s - loss: 0.2427 - regression_loss: 0.2316 - classification_loss: 0.0111 407/500 [=======================>......] - ETA: 31s - loss: 0.2425 - regression_loss: 0.2315 - classification_loss: 0.0110 408/500 [=======================>......] - ETA: 30s - loss: 0.2423 - regression_loss: 0.2313 - classification_loss: 0.0110 409/500 [=======================>......] 
- ETA: 30s - loss: 0.2420 - regression_loss: 0.2310 - classification_loss: 0.0110 410/500 [=======================>......] - ETA: 30s - loss: 0.2427 - regression_loss: 0.2317 - classification_loss: 0.0110 411/500 [=======================>......] - ETA: 29s - loss: 0.2427 - regression_loss: 0.2317 - classification_loss: 0.0110 412/500 [=======================>......] - ETA: 29s - loss: 0.2421 - regression_loss: 0.2312 - classification_loss: 0.0110 413/500 [=======================>......] - ETA: 29s - loss: 0.2420 - regression_loss: 0.2310 - classification_loss: 0.0110 414/500 [=======================>......] - ETA: 28s - loss: 0.2421 - regression_loss: 0.2311 - classification_loss: 0.0110 415/500 [=======================>......] - ETA: 28s - loss: 0.2420 - regression_loss: 0.2310 - classification_loss: 0.0110 416/500 [=======================>......] - ETA: 28s - loss: 0.2422 - regression_loss: 0.2312 - classification_loss: 0.0110 417/500 [========================>.....] - ETA: 27s - loss: 0.2424 - regression_loss: 0.2314 - classification_loss: 0.0110 418/500 [========================>.....] - ETA: 27s - loss: 0.2426 - regression_loss: 0.2316 - classification_loss: 0.0110 419/500 [========================>.....] - ETA: 27s - loss: 0.2426 - regression_loss: 0.2315 - classification_loss: 0.0110 420/500 [========================>.....] - ETA: 26s - loss: 0.2424 - regression_loss: 0.2314 - classification_loss: 0.0110 421/500 [========================>.....] - ETA: 26s - loss: 0.2428 - regression_loss: 0.2316 - classification_loss: 0.0112 422/500 [========================>.....] - ETA: 26s - loss: 0.2429 - regression_loss: 0.2317 - classification_loss: 0.0113 423/500 [========================>.....] - ETA: 25s - loss: 0.2427 - regression_loss: 0.2315 - classification_loss: 0.0113 424/500 [========================>.....] - ETA: 25s - loss: 0.2424 - regression_loss: 0.2312 - classification_loss: 0.0112 425/500 [========================>.....] 
- ETA: 25s - loss: 0.2428 - regression_loss: 0.2315 - classification_loss: 0.0113 426/500 [========================>.....] - ETA: 24s - loss: 0.2425 - regression_loss: 0.2313 - classification_loss: 0.0113 427/500 [========================>.....] - ETA: 24s - loss: 0.2425 - regression_loss: 0.2313 - classification_loss: 0.0113 428/500 [========================>.....] - ETA: 24s - loss: 0.2427 - regression_loss: 0.2314 - classification_loss: 0.0113 429/500 [========================>.....] - ETA: 23s - loss: 0.2427 - regression_loss: 0.2314 - classification_loss: 0.0113 430/500 [========================>.....] - ETA: 23s - loss: 0.2428 - regression_loss: 0.2315 - classification_loss: 0.0113 431/500 [========================>.....] - ETA: 23s - loss: 0.2426 - regression_loss: 0.2313 - classification_loss: 0.0113 432/500 [========================>.....] - ETA: 22s - loss: 0.2425 - regression_loss: 0.2312 - classification_loss: 0.0113 433/500 [========================>.....] - ETA: 22s - loss: 0.2423 - regression_loss: 0.2310 - classification_loss: 0.0113 434/500 [=========================>....] - ETA: 22s - loss: 0.2421 - regression_loss: 0.2308 - classification_loss: 0.0113 435/500 [=========================>....] - ETA: 21s - loss: 0.2421 - regression_loss: 0.2308 - classification_loss: 0.0113 436/500 [=========================>....] - ETA: 21s - loss: 0.2424 - regression_loss: 0.2311 - classification_loss: 0.0113 437/500 [=========================>....] - ETA: 21s - loss: 0.2423 - regression_loss: 0.2310 - classification_loss: 0.0113 438/500 [=========================>....] - ETA: 20s - loss: 0.2422 - regression_loss: 0.2309 - classification_loss: 0.0113 439/500 [=========================>....] - ETA: 20s - loss: 0.2419 - regression_loss: 0.2306 - classification_loss: 0.0113 440/500 [=========================>....] - ETA: 20s - loss: 0.2416 - regression_loss: 0.2304 - classification_loss: 0.0113 441/500 [=========================>....] 
- ETA: 19s - loss: 0.2417 - regression_loss: 0.2304 - classification_loss: 0.0113 442/500 [=========================>....] - ETA: 19s - loss: 0.2418 - regression_loss: 0.2306 - classification_loss: 0.0113 443/500 [=========================>....] - ETA: 19s - loss: 0.2419 - regression_loss: 0.2306 - classification_loss: 0.0113 444/500 [=========================>....] - ETA: 18s - loss: 0.2421 - regression_loss: 0.2307 - classification_loss: 0.0113 445/500 [=========================>....] - ETA: 18s - loss: 0.2418 - regression_loss: 0.2305 - classification_loss: 0.0113 446/500 [=========================>....] - ETA: 18s - loss: 0.2418 - regression_loss: 0.2305 - classification_loss: 0.0113 447/500 [=========================>....] - ETA: 17s - loss: 0.2414 - regression_loss: 0.2301 - classification_loss: 0.0113 448/500 [=========================>....] - ETA: 17s - loss: 0.2415 - regression_loss: 0.2302 - classification_loss: 0.0113 449/500 [=========================>....] - ETA: 17s - loss: 0.2414 - regression_loss: 0.2301 - classification_loss: 0.0113 450/500 [==========================>...] - ETA: 16s - loss: 0.2415 - regression_loss: 0.2302 - classification_loss: 0.0113 451/500 [==========================>...] - ETA: 16s - loss: 0.2412 - regression_loss: 0.2300 - classification_loss: 0.0113 452/500 [==========================>...] - ETA: 16s - loss: 0.2410 - regression_loss: 0.2297 - classification_loss: 0.0113 453/500 [==========================>...] - ETA: 15s - loss: 0.2409 - regression_loss: 0.2296 - classification_loss: 0.0112 454/500 [==========================>...] - ETA: 15s - loss: 0.2408 - regression_loss: 0.2295 - classification_loss: 0.0112 455/500 [==========================>...] - ETA: 15s - loss: 0.2407 - regression_loss: 0.2295 - classification_loss: 0.0112 456/500 [==========================>...] - ETA: 14s - loss: 0.2409 - regression_loss: 0.2297 - classification_loss: 0.0113 457/500 [==========================>...] 
- ETA: 14s - loss: 0.2409 - regression_loss: 0.2296 - classification_loss: 0.0113 458/500 [==========================>...] - ETA: 14s - loss: 0.2406 - regression_loss: 0.2293 - classification_loss: 0.0112 459/500 [==========================>...] - ETA: 13s - loss: 0.2404 - regression_loss: 0.2292 - classification_loss: 0.0112 460/500 [==========================>...] - ETA: 13s - loss: 0.2406 - regression_loss: 0.2294 - classification_loss: 0.0112 461/500 [==========================>...] - ETA: 13s - loss: 0.2402 - regression_loss: 0.2290 - classification_loss: 0.0112 462/500 [==========================>...] - ETA: 12s - loss: 0.2402 - regression_loss: 0.2290 - classification_loss: 0.0112 463/500 [==========================>...] - ETA: 12s - loss: 0.2404 - regression_loss: 0.2292 - classification_loss: 0.0112 464/500 [==========================>...] - ETA: 12s - loss: 0.2403 - regression_loss: 0.2291 - classification_loss: 0.0112 465/500 [==========================>...] - ETA: 11s - loss: 0.2400 - regression_loss: 0.2288 - classification_loss: 0.0112 466/500 [==========================>...] - ETA: 11s - loss: 0.2401 - regression_loss: 0.2289 - classification_loss: 0.0112 467/500 [===========================>..] - ETA: 11s - loss: 0.2403 - regression_loss: 0.2291 - classification_loss: 0.0112 468/500 [===========================>..] - ETA: 10s - loss: 0.2401 - regression_loss: 0.2289 - classification_loss: 0.0112 469/500 [===========================>..] - ETA: 10s - loss: 0.2399 - regression_loss: 0.2287 - classification_loss: 0.0112 470/500 [===========================>..] - ETA: 10s - loss: 0.2397 - regression_loss: 0.2286 - classification_loss: 0.0112 471/500 [===========================>..] - ETA: 9s - loss: 0.2396 - regression_loss: 0.2284 - classification_loss: 0.0112  472/500 [===========================>..] - ETA: 9s - loss: 0.2391 - regression_loss: 0.2280 - classification_loss: 0.0111 473/500 [===========================>..] 
- ETA: 9s - loss: 0.2393 - regression_loss: 0.2281 - classification_loss: 0.0111 474/500 [===========================>..] - ETA: 8s - loss: 0.2395 - regression_loss: 0.2283 - classification_loss: 0.0112 475/500 [===========================>..] - ETA: 8s - loss: 0.2394 - regression_loss: 0.2282 - classification_loss: 0.0111 476/500 [===========================>..] - ETA: 8s - loss: 0.2391 - regression_loss: 0.2280 - classification_loss: 0.0111 477/500 [===========================>..] - ETA: 7s - loss: 0.2389 - regression_loss: 0.2278 - classification_loss: 0.0111 478/500 [===========================>..] - ETA: 7s - loss: 0.2389 - regression_loss: 0.2278 - classification_loss: 0.0111 479/500 [===========================>..] - ETA: 7s - loss: 0.2390 - regression_loss: 0.2279 - classification_loss: 0.0111 480/500 [===========================>..] - ETA: 6s - loss: 0.2387 - regression_loss: 0.2276 - classification_loss: 0.0111 481/500 [===========================>..] - ETA: 6s - loss: 0.2388 - regression_loss: 0.2277 - classification_loss: 0.0111 482/500 [===========================>..] - ETA: 6s - loss: 0.2388 - regression_loss: 0.2277 - classification_loss: 0.0111 483/500 [===========================>..] - ETA: 5s - loss: 0.2387 - regression_loss: 0.2276 - classification_loss: 0.0111 484/500 [============================>.] - ETA: 5s - loss: 0.2387 - regression_loss: 0.2277 - classification_loss: 0.0111 485/500 [============================>.] - ETA: 5s - loss: 0.2387 - regression_loss: 0.2277 - classification_loss: 0.0111 486/500 [============================>.] - ETA: 4s - loss: 0.2388 - regression_loss: 0.2278 - classification_loss: 0.0111 487/500 [============================>.] - ETA: 4s - loss: 0.2386 - regression_loss: 0.2275 - classification_loss: 0.0111 488/500 [============================>.] - ETA: 4s - loss: 0.2384 - regression_loss: 0.2273 - classification_loss: 0.0110 489/500 [============================>.] 
500/500 [==============================] - 167s 334ms/step - loss: 0.2379 - regression_loss: 0.2269 - classification_loss: 0.0110
1172 instances of class plum with average precision: 0.7511
mAP: 0.7511
Epoch 00047: saving model to ./training/snapshots/resnet101_pascal_47.h5
Epoch 48/150
[epoch 48 per-step progress trimmed (steps 1-4/500)]
[epoch 48 per-step progress trimmed (steps 5-99/500): loss settled around 0.22, regression_loss ~0.21, classification_loss ~0.006-0.010]
- ETA: 2:15 - loss: 0.2244 - regression_loss: 0.2153 - classification_loss: 0.0091 101/500 [=====>........................] - ETA: 2:15 - loss: 0.2241 - regression_loss: 0.2150 - classification_loss: 0.0091 102/500 [=====>........................] - ETA: 2:14 - loss: 0.2230 - regression_loss: 0.2140 - classification_loss: 0.0090 103/500 [=====>........................] - ETA: 2:14 - loss: 0.2247 - regression_loss: 0.2155 - classification_loss: 0.0092 104/500 [=====>........................] - ETA: 2:13 - loss: 0.2242 - regression_loss: 0.2151 - classification_loss: 0.0091 105/500 [=====>........................] - ETA: 2:13 - loss: 0.2238 - regression_loss: 0.2147 - classification_loss: 0.0091 106/500 [=====>........................] - ETA: 2:13 - loss: 0.2241 - regression_loss: 0.2148 - classification_loss: 0.0093 107/500 [=====>........................] - ETA: 2:12 - loss: 0.2238 - regression_loss: 0.2146 - classification_loss: 0.0092 108/500 [=====>........................] - ETA: 2:12 - loss: 0.2245 - regression_loss: 0.2152 - classification_loss: 0.0093 109/500 [=====>........................] - ETA: 2:12 - loss: 0.2239 - regression_loss: 0.2147 - classification_loss: 0.0092 110/500 [=====>........................] - ETA: 2:11 - loss: 0.2232 - regression_loss: 0.2140 - classification_loss: 0.0092 111/500 [=====>........................] - ETA: 2:11 - loss: 0.2230 - regression_loss: 0.2138 - classification_loss: 0.0092 112/500 [=====>........................] - ETA: 2:11 - loss: 0.2237 - regression_loss: 0.2144 - classification_loss: 0.0093 113/500 [=====>........................] - ETA: 2:10 - loss: 0.2238 - regression_loss: 0.2145 - classification_loss: 0.0093 114/500 [=====>........................] - ETA: 2:10 - loss: 0.2256 - regression_loss: 0.2162 - classification_loss: 0.0094 115/500 [=====>........................] - ETA: 2:10 - loss: 0.2253 - regression_loss: 0.2160 - classification_loss: 0.0094 116/500 [=====>........................] 
- ETA: 2:09 - loss: 0.2253 - regression_loss: 0.2159 - classification_loss: 0.0094 117/500 [======>.......................] - ETA: 2:09 - loss: 0.2251 - regression_loss: 0.2157 - classification_loss: 0.0094 118/500 [======>.......................] - ETA: 2:09 - loss: 0.2247 - regression_loss: 0.2153 - classification_loss: 0.0094 119/500 [======>.......................] - ETA: 2:08 - loss: 0.2245 - regression_loss: 0.2152 - classification_loss: 0.0093 120/500 [======>.......................] - ETA: 2:08 - loss: 0.2242 - regression_loss: 0.2148 - classification_loss: 0.0095 121/500 [======>.......................] - ETA: 2:08 - loss: 0.2245 - regression_loss: 0.2150 - classification_loss: 0.0094 122/500 [======>.......................] - ETA: 2:07 - loss: 0.2242 - regression_loss: 0.2148 - classification_loss: 0.0094 123/500 [======>.......................] - ETA: 2:07 - loss: 0.2237 - regression_loss: 0.2143 - classification_loss: 0.0094 124/500 [======>.......................] - ETA: 2:07 - loss: 0.2245 - regression_loss: 0.2149 - classification_loss: 0.0096 125/500 [======>.......................] - ETA: 2:06 - loss: 0.2245 - regression_loss: 0.2149 - classification_loss: 0.0096 126/500 [======>.......................] - ETA: 2:06 - loss: 0.2246 - regression_loss: 0.2150 - classification_loss: 0.0095 127/500 [======>.......................] - ETA: 2:06 - loss: 0.2238 - regression_loss: 0.2143 - classification_loss: 0.0095 128/500 [======>.......................] - ETA: 2:05 - loss: 0.2237 - regression_loss: 0.2142 - classification_loss: 0.0095 129/500 [======>.......................] - ETA: 2:05 - loss: 0.2260 - regression_loss: 0.2163 - classification_loss: 0.0097 130/500 [======>.......................] - ETA: 2:05 - loss: 0.2259 - regression_loss: 0.2162 - classification_loss: 0.0097 131/500 [======>.......................] - ETA: 2:04 - loss: 0.2271 - regression_loss: 0.2172 - classification_loss: 0.0099 132/500 [======>.......................] 
- ETA: 2:04 - loss: 0.2287 - regression_loss: 0.2188 - classification_loss: 0.0099 133/500 [======>.......................] - ETA: 2:04 - loss: 0.2277 - regression_loss: 0.2178 - classification_loss: 0.0099 134/500 [=======>......................] - ETA: 2:03 - loss: 0.2277 - regression_loss: 0.2178 - classification_loss: 0.0099 135/500 [=======>......................] - ETA: 2:03 - loss: 0.2282 - regression_loss: 0.2183 - classification_loss: 0.0099 136/500 [=======>......................] - ETA: 2:02 - loss: 0.2283 - regression_loss: 0.2184 - classification_loss: 0.0099 137/500 [=======>......................] - ETA: 2:02 - loss: 0.2277 - regression_loss: 0.2178 - classification_loss: 0.0099 138/500 [=======>......................] - ETA: 2:02 - loss: 0.2268 - regression_loss: 0.2170 - classification_loss: 0.0098 139/500 [=======>......................] - ETA: 2:01 - loss: 0.2277 - regression_loss: 0.2175 - classification_loss: 0.0102 140/500 [=======>......................] - ETA: 2:01 - loss: 0.2283 - regression_loss: 0.2180 - classification_loss: 0.0103 141/500 [=======>......................] - ETA: 2:01 - loss: 0.2281 - regression_loss: 0.2178 - classification_loss: 0.0103 142/500 [=======>......................] - ETA: 2:00 - loss: 0.2284 - regression_loss: 0.2180 - classification_loss: 0.0104 143/500 [=======>......................] - ETA: 2:00 - loss: 0.2284 - regression_loss: 0.2181 - classification_loss: 0.0103 144/500 [=======>......................] - ETA: 2:00 - loss: 0.2283 - regression_loss: 0.2180 - classification_loss: 0.0103 145/500 [=======>......................] - ETA: 1:59 - loss: 0.2277 - regression_loss: 0.2174 - classification_loss: 0.0103 146/500 [=======>......................] - ETA: 1:59 - loss: 0.2293 - regression_loss: 0.2190 - classification_loss: 0.0103 147/500 [=======>......................] - ETA: 1:59 - loss: 0.2293 - regression_loss: 0.2191 - classification_loss: 0.0103 148/500 [=======>......................] 
- ETA: 1:58 - loss: 0.2297 - regression_loss: 0.2194 - classification_loss: 0.0103 149/500 [=======>......................] - ETA: 1:58 - loss: 0.2303 - regression_loss: 0.2199 - classification_loss: 0.0103 150/500 [========>.....................] - ETA: 1:58 - loss: 0.2302 - regression_loss: 0.2198 - classification_loss: 0.0103 151/500 [========>.....................] - ETA: 1:57 - loss: 0.2303 - regression_loss: 0.2200 - classification_loss: 0.0103 152/500 [========>.....................] - ETA: 1:57 - loss: 0.2315 - regression_loss: 0.2211 - classification_loss: 0.0105 153/500 [========>.....................] - ETA: 1:57 - loss: 0.2313 - regression_loss: 0.2209 - classification_loss: 0.0105 154/500 [========>.....................] - ETA: 1:56 - loss: 0.2304 - regression_loss: 0.2200 - classification_loss: 0.0104 155/500 [========>.....................] - ETA: 1:56 - loss: 0.2310 - regression_loss: 0.2206 - classification_loss: 0.0104 156/500 [========>.....................] - ETA: 1:56 - loss: 0.2307 - regression_loss: 0.2203 - classification_loss: 0.0104 157/500 [========>.....................] - ETA: 1:55 - loss: 0.2306 - regression_loss: 0.2203 - classification_loss: 0.0103 158/500 [========>.....................] - ETA: 1:55 - loss: 0.2308 - regression_loss: 0.2205 - classification_loss: 0.0103 159/500 [========>.....................] - ETA: 1:55 - loss: 0.2295 - regression_loss: 0.2193 - classification_loss: 0.0102 160/500 [========>.....................] - ETA: 1:54 - loss: 0.2292 - regression_loss: 0.2190 - classification_loss: 0.0102 161/500 [========>.....................] - ETA: 1:54 - loss: 0.2286 - regression_loss: 0.2184 - classification_loss: 0.0102 162/500 [========>.....................] - ETA: 1:54 - loss: 0.2291 - regression_loss: 0.2189 - classification_loss: 0.0102 163/500 [========>.....................] - ETA: 1:53 - loss: 0.2295 - regression_loss: 0.2193 - classification_loss: 0.0102 164/500 [========>.....................] 
- ETA: 1:53 - loss: 0.2293 - regression_loss: 0.2191 - classification_loss: 0.0102 165/500 [========>.....................] - ETA: 1:53 - loss: 0.2299 - regression_loss: 0.2196 - classification_loss: 0.0102 166/500 [========>.....................] - ETA: 1:52 - loss: 0.2306 - regression_loss: 0.2203 - classification_loss: 0.0103 167/500 [=========>....................] - ETA: 1:52 - loss: 0.2305 - regression_loss: 0.2201 - classification_loss: 0.0103 168/500 [=========>....................] - ETA: 1:52 - loss: 0.2302 - regression_loss: 0.2199 - classification_loss: 0.0103 169/500 [=========>....................] - ETA: 1:51 - loss: 0.2306 - regression_loss: 0.2203 - classification_loss: 0.0103 170/500 [=========>....................] - ETA: 1:51 - loss: 0.2304 - regression_loss: 0.2201 - classification_loss: 0.0103 171/500 [=========>....................] - ETA: 1:51 - loss: 0.2309 - regression_loss: 0.2206 - classification_loss: 0.0103 172/500 [=========>....................] - ETA: 1:50 - loss: 0.2316 - regression_loss: 0.2213 - classification_loss: 0.0103 173/500 [=========>....................] - ETA: 1:50 - loss: 0.2313 - regression_loss: 0.2210 - classification_loss: 0.0103 174/500 [=========>....................] - ETA: 1:50 - loss: 0.2312 - regression_loss: 0.2210 - classification_loss: 0.0102 175/500 [=========>....................] - ETA: 1:49 - loss: 0.2312 - regression_loss: 0.2210 - classification_loss: 0.0102 176/500 [=========>....................] - ETA: 1:49 - loss: 0.2310 - regression_loss: 0.2208 - classification_loss: 0.0102 177/500 [=========>....................] - ETA: 1:49 - loss: 0.2308 - regression_loss: 0.2206 - classification_loss: 0.0102 178/500 [=========>....................] - ETA: 1:48 - loss: 0.2316 - regression_loss: 0.2213 - classification_loss: 0.0103 179/500 [=========>....................] - ETA: 1:48 - loss: 0.2309 - regression_loss: 0.2206 - classification_loss: 0.0102 180/500 [=========>....................] 
- ETA: 1:48 - loss: 0.2312 - regression_loss: 0.2209 - classification_loss: 0.0103 181/500 [=========>....................] - ETA: 1:47 - loss: 0.2320 - regression_loss: 0.2217 - classification_loss: 0.0103 182/500 [=========>....................] - ETA: 1:47 - loss: 0.2323 - regression_loss: 0.2219 - classification_loss: 0.0104 183/500 [=========>....................] - ETA: 1:47 - loss: 0.2316 - regression_loss: 0.2212 - classification_loss: 0.0103 184/500 [==========>...................] - ETA: 1:46 - loss: 0.2308 - regression_loss: 0.2205 - classification_loss: 0.0103 185/500 [==========>...................] - ETA: 1:46 - loss: 0.2309 - regression_loss: 0.2206 - classification_loss: 0.0103 186/500 [==========>...................] - ETA: 1:46 - loss: 0.2312 - regression_loss: 0.2209 - classification_loss: 0.0103 187/500 [==========>...................] - ETA: 1:45 - loss: 0.2305 - regression_loss: 0.2203 - classification_loss: 0.0102 188/500 [==========>...................] - ETA: 1:45 - loss: 0.2298 - regression_loss: 0.2196 - classification_loss: 0.0102 189/500 [==========>...................] - ETA: 1:45 - loss: 0.2303 - regression_loss: 0.2201 - classification_loss: 0.0102 190/500 [==========>...................] - ETA: 1:44 - loss: 0.2303 - regression_loss: 0.2201 - classification_loss: 0.0102 191/500 [==========>...................] - ETA: 1:44 - loss: 0.2316 - regression_loss: 0.2214 - classification_loss: 0.0102 192/500 [==========>...................] - ETA: 1:44 - loss: 0.2337 - regression_loss: 0.2235 - classification_loss: 0.0102 193/500 [==========>...................] - ETA: 1:43 - loss: 0.2334 - regression_loss: 0.2232 - classification_loss: 0.0102 194/500 [==========>...................] - ETA: 1:43 - loss: 0.2329 - regression_loss: 0.2228 - classification_loss: 0.0102 195/500 [==========>...................] - ETA: 1:43 - loss: 0.2328 - regression_loss: 0.2227 - classification_loss: 0.0102 196/500 [==========>...................] 
- ETA: 1:42 - loss: 0.2327 - regression_loss: 0.2225 - classification_loss: 0.0102 197/500 [==========>...................] - ETA: 1:42 - loss: 0.2322 - regression_loss: 0.2220 - classification_loss: 0.0101 198/500 [==========>...................] - ETA: 1:42 - loss: 0.2331 - regression_loss: 0.2229 - classification_loss: 0.0102 199/500 [==========>...................] - ETA: 1:41 - loss: 0.2338 - regression_loss: 0.2236 - classification_loss: 0.0102 200/500 [===========>..................] - ETA: 1:41 - loss: 0.2336 - regression_loss: 0.2234 - classification_loss: 0.0102 201/500 [===========>..................] - ETA: 1:40 - loss: 0.2339 - regression_loss: 0.2238 - classification_loss: 0.0101 202/500 [===========>..................] - ETA: 1:40 - loss: 0.2343 - regression_loss: 0.2242 - classification_loss: 0.0101 203/500 [===========>..................] - ETA: 1:40 - loss: 0.2346 - regression_loss: 0.2245 - classification_loss: 0.0101 204/500 [===========>..................] - ETA: 1:39 - loss: 0.2351 - regression_loss: 0.2249 - classification_loss: 0.0102 205/500 [===========>..................] - ETA: 1:39 - loss: 0.2358 - regression_loss: 0.2256 - classification_loss: 0.0102 206/500 [===========>..................] - ETA: 1:39 - loss: 0.2368 - regression_loss: 0.2266 - classification_loss: 0.0102 207/500 [===========>..................] - ETA: 1:38 - loss: 0.2368 - regression_loss: 0.2266 - classification_loss: 0.0103 208/500 [===========>..................] - ETA: 1:38 - loss: 0.2369 - regression_loss: 0.2266 - classification_loss: 0.0103 209/500 [===========>..................] - ETA: 1:38 - loss: 0.2369 - regression_loss: 0.2266 - classification_loss: 0.0103 210/500 [===========>..................] - ETA: 1:37 - loss: 0.2374 - regression_loss: 0.2270 - classification_loss: 0.0104 211/500 [===========>..................] - ETA: 1:37 - loss: 0.2383 - regression_loss: 0.2279 - classification_loss: 0.0104 212/500 [===========>..................] 
- ETA: 1:36 - loss: 0.2389 - regression_loss: 0.2285 - classification_loss: 0.0104 213/500 [===========>..................] - ETA: 1:36 - loss: 0.2390 - regression_loss: 0.2286 - classification_loss: 0.0104 214/500 [===========>..................] - ETA: 1:36 - loss: 0.2381 - regression_loss: 0.2278 - classification_loss: 0.0103 215/500 [===========>..................] - ETA: 1:35 - loss: 0.2400 - regression_loss: 0.2295 - classification_loss: 0.0104 216/500 [===========>..................] - ETA: 1:35 - loss: 0.2407 - regression_loss: 0.2303 - classification_loss: 0.0105 217/500 [============>.................] - ETA: 1:35 - loss: 0.2423 - regression_loss: 0.2317 - classification_loss: 0.0105 218/500 [============>.................] - ETA: 1:34 - loss: 0.2431 - regression_loss: 0.2324 - classification_loss: 0.0106 219/500 [============>.................] - ETA: 1:34 - loss: 0.2435 - regression_loss: 0.2328 - classification_loss: 0.0106 220/500 [============>.................] - ETA: 1:34 - loss: 0.2431 - regression_loss: 0.2326 - classification_loss: 0.0106 221/500 [============>.................] - ETA: 1:33 - loss: 0.2427 - regression_loss: 0.2321 - classification_loss: 0.0106 222/500 [============>.................] - ETA: 1:33 - loss: 0.2426 - regression_loss: 0.2321 - classification_loss: 0.0105 223/500 [============>.................] - ETA: 1:33 - loss: 0.2421 - regression_loss: 0.2316 - classification_loss: 0.0105 224/500 [============>.................] - ETA: 1:32 - loss: 0.2422 - regression_loss: 0.2317 - classification_loss: 0.0105 225/500 [============>.................] - ETA: 1:32 - loss: 0.2416 - regression_loss: 0.2311 - classification_loss: 0.0105 226/500 [============>.................] - ETA: 1:32 - loss: 0.2421 - regression_loss: 0.2316 - classification_loss: 0.0105 227/500 [============>.................] - ETA: 1:31 - loss: 0.2420 - regression_loss: 0.2315 - classification_loss: 0.0105 228/500 [============>.................] 
- ETA: 1:31 - loss: 0.2419 - regression_loss: 0.2315 - classification_loss: 0.0105 229/500 [============>.................] - ETA: 1:30 - loss: 0.2413 - regression_loss: 0.2308 - classification_loss: 0.0104 230/500 [============>.................] - ETA: 1:30 - loss: 0.2411 - regression_loss: 0.2307 - classification_loss: 0.0104 231/500 [============>.................] - ETA: 1:30 - loss: 0.2414 - regression_loss: 0.2310 - classification_loss: 0.0104 232/500 [============>.................] - ETA: 1:29 - loss: 0.2421 - regression_loss: 0.2316 - classification_loss: 0.0105 233/500 [============>.................] - ETA: 1:29 - loss: 0.2421 - regression_loss: 0.2316 - classification_loss: 0.0105 234/500 [=============>................] - ETA: 1:29 - loss: 0.2417 - regression_loss: 0.2313 - classification_loss: 0.0105 235/500 [=============>................] - ETA: 1:28 - loss: 0.2418 - regression_loss: 0.2314 - classification_loss: 0.0105 236/500 [=============>................] - ETA: 1:28 - loss: 0.2422 - regression_loss: 0.2316 - classification_loss: 0.0106 237/500 [=============>................] - ETA: 1:28 - loss: 0.2427 - regression_loss: 0.2321 - classification_loss: 0.0106 238/500 [=============>................] - ETA: 1:27 - loss: 0.2433 - regression_loss: 0.2328 - classification_loss: 0.0106 239/500 [=============>................] - ETA: 1:27 - loss: 0.2434 - regression_loss: 0.2328 - classification_loss: 0.0106 240/500 [=============>................] - ETA: 1:27 - loss: 0.2434 - regression_loss: 0.2328 - classification_loss: 0.0106 241/500 [=============>................] - ETA: 1:26 - loss: 0.2435 - regression_loss: 0.2329 - classification_loss: 0.0105 242/500 [=============>................] - ETA: 1:26 - loss: 0.2427 - regression_loss: 0.2322 - classification_loss: 0.0105 243/500 [=============>................] - ETA: 1:26 - loss: 0.2427 - regression_loss: 0.2322 - classification_loss: 0.0105 244/500 [=============>................] 
- ETA: 1:25 - loss: 0.2425 - regression_loss: 0.2320 - classification_loss: 0.0105 245/500 [=============>................] - ETA: 1:25 - loss: 0.2420 - regression_loss: 0.2315 - classification_loss: 0.0104 246/500 [=============>................] - ETA: 1:25 - loss: 0.2425 - regression_loss: 0.2320 - classification_loss: 0.0105 247/500 [=============>................] - ETA: 1:24 - loss: 0.2420 - regression_loss: 0.2315 - classification_loss: 0.0105 248/500 [=============>................] - ETA: 1:24 - loss: 0.2423 - regression_loss: 0.2318 - classification_loss: 0.0105 249/500 [=============>................] - ETA: 1:24 - loss: 0.2416 - regression_loss: 0.2311 - classification_loss: 0.0105 250/500 [==============>...............] - ETA: 1:23 - loss: 0.2416 - regression_loss: 0.2311 - classification_loss: 0.0105 251/500 [==============>...............] - ETA: 1:23 - loss: 0.2411 - regression_loss: 0.2306 - classification_loss: 0.0105 252/500 [==============>...............] - ETA: 1:22 - loss: 0.2405 - regression_loss: 0.2301 - classification_loss: 0.0104 253/500 [==============>...............] - ETA: 1:22 - loss: 0.2402 - regression_loss: 0.2298 - classification_loss: 0.0104 254/500 [==============>...............] - ETA: 1:22 - loss: 0.2401 - regression_loss: 0.2297 - classification_loss: 0.0104 255/500 [==============>...............] - ETA: 1:21 - loss: 0.2397 - regression_loss: 0.2293 - classification_loss: 0.0104 256/500 [==============>...............] - ETA: 1:21 - loss: 0.2397 - regression_loss: 0.2294 - classification_loss: 0.0103 257/500 [==============>...............] - ETA: 1:21 - loss: 0.2397 - regression_loss: 0.2294 - classification_loss: 0.0103 258/500 [==============>...............] - ETA: 1:20 - loss: 0.2395 - regression_loss: 0.2291 - classification_loss: 0.0103 259/500 [==============>...............] - ETA: 1:20 - loss: 0.2395 - regression_loss: 0.2291 - classification_loss: 0.0104 260/500 [==============>...............] 
- ETA: 1:20 - loss: 0.2389 - regression_loss: 0.2286 - classification_loss: 0.0103 261/500 [==============>...............] - ETA: 1:19 - loss: 0.2390 - regression_loss: 0.2286 - classification_loss: 0.0103 262/500 [==============>...............] - ETA: 1:19 - loss: 0.2393 - regression_loss: 0.2289 - classification_loss: 0.0104 263/500 [==============>...............] - ETA: 1:19 - loss: 0.2389 - regression_loss: 0.2286 - classification_loss: 0.0103 264/500 [==============>...............] - ETA: 1:18 - loss: 0.2385 - regression_loss: 0.2282 - classification_loss: 0.0103 265/500 [==============>...............] - ETA: 1:18 - loss: 0.2391 - regression_loss: 0.2288 - classification_loss: 0.0103 266/500 [==============>...............] - ETA: 1:18 - loss: 0.2386 - regression_loss: 0.2284 - classification_loss: 0.0103 267/500 [===============>..............] - ETA: 1:17 - loss: 0.2392 - regression_loss: 0.2289 - classification_loss: 0.0103 268/500 [===============>..............] - ETA: 1:17 - loss: 0.2404 - regression_loss: 0.2301 - classification_loss: 0.0103 269/500 [===============>..............] - ETA: 1:17 - loss: 0.2404 - regression_loss: 0.2301 - classification_loss: 0.0103 270/500 [===============>..............] - ETA: 1:16 - loss: 0.2405 - regression_loss: 0.2303 - classification_loss: 0.0103 271/500 [===============>..............] - ETA: 1:16 - loss: 0.2404 - regression_loss: 0.2302 - classification_loss: 0.0103 272/500 [===============>..............] - ETA: 1:16 - loss: 0.2409 - regression_loss: 0.2306 - classification_loss: 0.0103 273/500 [===============>..............] - ETA: 1:15 - loss: 0.2407 - regression_loss: 0.2304 - classification_loss: 0.0103 274/500 [===============>..............] - ETA: 1:15 - loss: 0.2410 - regression_loss: 0.2307 - classification_loss: 0.0103 275/500 [===============>..............] - ETA: 1:15 - loss: 0.2405 - regression_loss: 0.2302 - classification_loss: 0.0103 276/500 [===============>..............] 
- ETA: 1:14 - loss: 0.2405 - regression_loss: 0.2302 - classification_loss: 0.0103 277/500 [===============>..............] - ETA: 1:14 - loss: 0.2408 - regression_loss: 0.2305 - classification_loss: 0.0103 278/500 [===============>..............] - ETA: 1:14 - loss: 0.2411 - regression_loss: 0.2308 - classification_loss: 0.0103 279/500 [===============>..............] - ETA: 1:13 - loss: 0.2412 - regression_loss: 0.2309 - classification_loss: 0.0103 280/500 [===============>..............] - ETA: 1:13 - loss: 0.2411 - regression_loss: 0.2308 - classification_loss: 0.0103 281/500 [===============>..............] - ETA: 1:13 - loss: 0.2408 - regression_loss: 0.2305 - classification_loss: 0.0102 282/500 [===============>..............] - ETA: 1:12 - loss: 0.2407 - regression_loss: 0.2304 - classification_loss: 0.0103 283/500 [===============>..............] - ETA: 1:12 - loss: 0.2409 - regression_loss: 0.2306 - classification_loss: 0.0103 284/500 [================>.............] - ETA: 1:12 - loss: 0.2408 - regression_loss: 0.2305 - classification_loss: 0.0103 285/500 [================>.............] - ETA: 1:11 - loss: 0.2404 - regression_loss: 0.2302 - classification_loss: 0.0102 286/500 [================>.............] - ETA: 1:11 - loss: 0.2406 - regression_loss: 0.2303 - classification_loss: 0.0102 287/500 [================>.............] - ETA: 1:11 - loss: 0.2407 - regression_loss: 0.2304 - classification_loss: 0.0103 288/500 [================>.............] - ETA: 1:10 - loss: 0.2408 - regression_loss: 0.2304 - classification_loss: 0.0103 289/500 [================>.............] - ETA: 1:10 - loss: 0.2405 - regression_loss: 0.2302 - classification_loss: 0.0103 290/500 [================>.............] - ETA: 1:10 - loss: 0.2405 - regression_loss: 0.2302 - classification_loss: 0.0103 291/500 [================>.............] - ETA: 1:09 - loss: 0.2409 - regression_loss: 0.2306 - classification_loss: 0.0103 292/500 [================>.............] 
- ETA: 1:09 - loss: 0.2414 - regression_loss: 0.2310 - classification_loss: 0.0104 293/500 [================>.............] - ETA: 1:09 - loss: 0.2416 - regression_loss: 0.2311 - classification_loss: 0.0104 294/500 [================>.............] - ETA: 1:08 - loss: 0.2417 - regression_loss: 0.2312 - classification_loss: 0.0105 295/500 [================>.............] - ETA: 1:08 - loss: 0.2416 - regression_loss: 0.2312 - classification_loss: 0.0104 296/500 [================>.............] - ETA: 1:08 - loss: 0.2421 - regression_loss: 0.2316 - classification_loss: 0.0105 297/500 [================>.............] - ETA: 1:07 - loss: 0.2418 - regression_loss: 0.2312 - classification_loss: 0.0105 298/500 [================>.............] - ETA: 1:07 - loss: 0.2422 - regression_loss: 0.2317 - classification_loss: 0.0105 299/500 [================>.............] - ETA: 1:07 - loss: 0.2420 - regression_loss: 0.2315 - classification_loss: 0.0105 300/500 [=================>............] - ETA: 1:06 - loss: 0.2419 - regression_loss: 0.2314 - classification_loss: 0.0105 301/500 [=================>............] - ETA: 1:06 - loss: 0.2413 - regression_loss: 0.2308 - classification_loss: 0.0105 302/500 [=================>............] - ETA: 1:06 - loss: 0.2414 - regression_loss: 0.2309 - classification_loss: 0.0106 303/500 [=================>............] - ETA: 1:05 - loss: 0.2412 - regression_loss: 0.2307 - classification_loss: 0.0106 304/500 [=================>............] - ETA: 1:05 - loss: 0.2415 - regression_loss: 0.2309 - classification_loss: 0.0106 305/500 [=================>............] - ETA: 1:05 - loss: 0.2419 - regression_loss: 0.2312 - classification_loss: 0.0107 306/500 [=================>............] - ETA: 1:04 - loss: 0.2425 - regression_loss: 0.2318 - classification_loss: 0.0107 307/500 [=================>............] - ETA: 1:04 - loss: 0.2424 - regression_loss: 0.2318 - classification_loss: 0.0107 308/500 [=================>............] 
500/500 [==============================] - 166s 333ms/step - loss: 0.2401 - regression_loss: 0.2295 - classification_loss: 0.0106
1172 instances of class plum with average precision: 0.7602
mAP: 0.7602
Epoch 00048: saving model to ./training/snapshots/resnet101_pascal_48.h5
Epoch 49/150
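When a run produces a log like the one above, it is often useful to recover the loss curves from the text after the fact. The snippet below is a minimal sketch, assuming the Keras 2.x progress-bar format shown here (`N/total [...] - ... - loss: x - regression_loss: y - classification_loss: z`); the regex and function name are illustrative, not part of keras-retinanet itself.

```python
import re

# Matches one Keras 2.x progress-bar update in the format seen in this log.
LINE_RE = re.compile(
    r"(\d+)/(\d+)\s+\[[=>.]*\].*?"
    r"loss:\s*([\d.]+)\s*-\s*regression_loss:\s*([\d.]+)"
    r"\s*-\s*classification_loss:\s*([\d.]+)"
)

def parse_progress(text):
    """Return (step, total, loss, regression_loss, classification_loss) tuples."""
    return [
        (int(m.group(1)), int(m.group(2)),
         float(m.group(3)), float(m.group(4)), float(m.group(5)))
        for m in LINE_RE.finditer(text)
    ]

sample = ("500/500 [==============================] - 166s 333ms/step - "
          "loss: 0.2401 - regression_loss: 0.2295 - classification_loss: 0.0106")
print(parse_progress(sample))  # [(500, 500, 0.2401, 0.2295, 0.0106)]
```

For new runs it is cleaner to avoid parsing altogether and log metrics directly, e.g. with Keras's `CSVLogger` callback, which writes one row per epoch.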
141/500 [=======>......................] - ETA: 1:59 - loss: 0.2371 - regression_loss: 0.2269 - classification_loss: 0.0102
- ETA: 1:59 - loss: 0.2368 - regression_loss: 0.2266 - classification_loss: 0.0101 143/500 [=======>......................] - ETA: 1:59 - loss: 0.2378 - regression_loss: 0.2275 - classification_loss: 0.0103 144/500 [=======>......................] - ETA: 1:58 - loss: 0.2372 - regression_loss: 0.2269 - classification_loss: 0.0103 145/500 [=======>......................] - ETA: 1:58 - loss: 0.2372 - regression_loss: 0.2269 - classification_loss: 0.0103 146/500 [=======>......................] - ETA: 1:58 - loss: 0.2371 - regression_loss: 0.2268 - classification_loss: 0.0103 147/500 [=======>......................] - ETA: 1:57 - loss: 0.2363 - regression_loss: 0.2260 - classification_loss: 0.0102 148/500 [=======>......................] - ETA: 1:57 - loss: 0.2355 - regression_loss: 0.2253 - classification_loss: 0.0102 149/500 [=======>......................] - ETA: 1:57 - loss: 0.2356 - regression_loss: 0.2254 - classification_loss: 0.0102 150/500 [========>.....................] - ETA: 1:56 - loss: 0.2349 - regression_loss: 0.2248 - classification_loss: 0.0101 151/500 [========>.....................] - ETA: 1:56 - loss: 0.2353 - regression_loss: 0.2251 - classification_loss: 0.0102 152/500 [========>.....................] - ETA: 1:56 - loss: 0.2349 - regression_loss: 0.2247 - classification_loss: 0.0101 153/500 [========>.....................] - ETA: 1:55 - loss: 0.2352 - regression_loss: 0.2251 - classification_loss: 0.0102 154/500 [========>.....................] - ETA: 1:55 - loss: 0.2347 - regression_loss: 0.2246 - classification_loss: 0.0101 155/500 [========>.....................] - ETA: 1:55 - loss: 0.2336 - regression_loss: 0.2235 - classification_loss: 0.0101 156/500 [========>.....................] - ETA: 1:54 - loss: 0.2335 - regression_loss: 0.2234 - classification_loss: 0.0101 157/500 [========>.....................] - ETA: 1:54 - loss: 0.2349 - regression_loss: 0.2247 - classification_loss: 0.0102 158/500 [========>.....................] 
- ETA: 1:54 - loss: 0.2358 - regression_loss: 0.2256 - classification_loss: 0.0102 159/500 [========>.....................] - ETA: 1:53 - loss: 0.2358 - regression_loss: 0.2256 - classification_loss: 0.0102 160/500 [========>.....................] - ETA: 1:53 - loss: 0.2363 - regression_loss: 0.2261 - classification_loss: 0.0102 161/500 [========>.....................] - ETA: 1:52 - loss: 0.2365 - regression_loss: 0.2263 - classification_loss: 0.0102 162/500 [========>.....................] - ETA: 1:52 - loss: 0.2364 - regression_loss: 0.2262 - classification_loss: 0.0102 163/500 [========>.....................] - ETA: 1:52 - loss: 0.2365 - regression_loss: 0.2263 - classification_loss: 0.0102 164/500 [========>.....................] - ETA: 1:51 - loss: 0.2360 - regression_loss: 0.2258 - classification_loss: 0.0101 165/500 [========>.....................] - ETA: 1:51 - loss: 0.2371 - regression_loss: 0.2269 - classification_loss: 0.0102 166/500 [========>.....................] - ETA: 1:51 - loss: 0.2381 - regression_loss: 0.2279 - classification_loss: 0.0102 167/500 [=========>....................] - ETA: 1:50 - loss: 0.2385 - regression_loss: 0.2283 - classification_loss: 0.0102 168/500 [=========>....................] - ETA: 1:50 - loss: 0.2387 - regression_loss: 0.2285 - classification_loss: 0.0102 169/500 [=========>....................] - ETA: 1:50 - loss: 0.2385 - regression_loss: 0.2282 - classification_loss: 0.0103 170/500 [=========>....................] - ETA: 1:49 - loss: 0.2384 - regression_loss: 0.2281 - classification_loss: 0.0102 171/500 [=========>....................] - ETA: 1:49 - loss: 0.2399 - regression_loss: 0.2293 - classification_loss: 0.0106 172/500 [=========>....................] - ETA: 1:49 - loss: 0.2395 - regression_loss: 0.2289 - classification_loss: 0.0106 173/500 [=========>....................] - ETA: 1:48 - loss: 0.2427 - regression_loss: 0.2318 - classification_loss: 0.0108 174/500 [=========>....................] 
- ETA: 1:48 - loss: 0.2423 - regression_loss: 0.2314 - classification_loss: 0.0109 175/500 [=========>....................] - ETA: 1:48 - loss: 0.2416 - regression_loss: 0.2307 - classification_loss: 0.0108 176/500 [=========>....................] - ETA: 1:47 - loss: 0.2422 - regression_loss: 0.2313 - classification_loss: 0.0108 177/500 [=========>....................] - ETA: 1:47 - loss: 0.2423 - regression_loss: 0.2315 - classification_loss: 0.0108 178/500 [=========>....................] - ETA: 1:47 - loss: 0.2417 - regression_loss: 0.2309 - classification_loss: 0.0108 179/500 [=========>....................] - ETA: 1:46 - loss: 0.2417 - regression_loss: 0.2310 - classification_loss: 0.0107 180/500 [=========>....................] - ETA: 1:46 - loss: 0.2422 - regression_loss: 0.2314 - classification_loss: 0.0107 181/500 [=========>....................] - ETA: 1:46 - loss: 0.2428 - regression_loss: 0.2321 - classification_loss: 0.0107 182/500 [=========>....................] - ETA: 1:45 - loss: 0.2428 - regression_loss: 0.2320 - classification_loss: 0.0108 183/500 [=========>....................] - ETA: 1:45 - loss: 0.2423 - regression_loss: 0.2316 - classification_loss: 0.0107 184/500 [==========>...................] - ETA: 1:45 - loss: 0.2422 - regression_loss: 0.2314 - classification_loss: 0.0108 185/500 [==========>...................] - ETA: 1:45 - loss: 0.2420 - regression_loss: 0.2312 - classification_loss: 0.0108 186/500 [==========>...................] - ETA: 1:44 - loss: 0.2425 - regression_loss: 0.2316 - classification_loss: 0.0109 187/500 [==========>...................] - ETA: 1:44 - loss: 0.2430 - regression_loss: 0.2321 - classification_loss: 0.0110 188/500 [==========>...................] - ETA: 1:43 - loss: 0.2432 - regression_loss: 0.2322 - classification_loss: 0.0110 189/500 [==========>...................] - ETA: 1:43 - loss: 0.2435 - regression_loss: 0.2325 - classification_loss: 0.0110 190/500 [==========>...................] 
- ETA: 1:43 - loss: 0.2438 - regression_loss: 0.2328 - classification_loss: 0.0110 191/500 [==========>...................] - ETA: 1:42 - loss: 0.2434 - regression_loss: 0.2324 - classification_loss: 0.0110 192/500 [==========>...................] - ETA: 1:42 - loss: 0.2436 - regression_loss: 0.2326 - classification_loss: 0.0110 193/500 [==========>...................] - ETA: 1:42 - loss: 0.2434 - regression_loss: 0.2324 - classification_loss: 0.0109 194/500 [==========>...................] - ETA: 1:42 - loss: 0.2434 - regression_loss: 0.2325 - classification_loss: 0.0109 195/500 [==========>...................] - ETA: 1:41 - loss: 0.2438 - regression_loss: 0.2328 - classification_loss: 0.0110 196/500 [==========>...................] - ETA: 1:41 - loss: 0.2435 - regression_loss: 0.2325 - classification_loss: 0.0110 197/500 [==========>...................] - ETA: 1:41 - loss: 0.2443 - regression_loss: 0.2332 - classification_loss: 0.0111 198/500 [==========>...................] - ETA: 1:40 - loss: 0.2439 - regression_loss: 0.2328 - classification_loss: 0.0111 199/500 [==========>...................] - ETA: 1:40 - loss: 0.2440 - regression_loss: 0.2329 - classification_loss: 0.0111 200/500 [===========>..................] - ETA: 1:40 - loss: 0.2450 - regression_loss: 0.2338 - classification_loss: 0.0112 201/500 [===========>..................] - ETA: 1:39 - loss: 0.2456 - regression_loss: 0.2344 - classification_loss: 0.0112 202/500 [===========>..................] - ETA: 1:39 - loss: 0.2460 - regression_loss: 0.2349 - classification_loss: 0.0112 203/500 [===========>..................] - ETA: 1:39 - loss: 0.2465 - regression_loss: 0.2353 - classification_loss: 0.0112 204/500 [===========>..................] - ETA: 1:38 - loss: 0.2471 - regression_loss: 0.2359 - classification_loss: 0.0111 205/500 [===========>..................] - ETA: 1:38 - loss: 0.2470 - regression_loss: 0.2359 - classification_loss: 0.0111 206/500 [===========>..................] 
- ETA: 1:38 - loss: 0.2467 - regression_loss: 0.2356 - classification_loss: 0.0111 207/500 [===========>..................] - ETA: 1:37 - loss: 0.2464 - regression_loss: 0.2354 - classification_loss: 0.0111 208/500 [===========>..................] - ETA: 1:37 - loss: 0.2460 - regression_loss: 0.2350 - classification_loss: 0.0110 209/500 [===========>..................] - ETA: 1:37 - loss: 0.2455 - regression_loss: 0.2345 - classification_loss: 0.0110 210/500 [===========>..................] - ETA: 1:36 - loss: 0.2449 - regression_loss: 0.2340 - classification_loss: 0.0110 211/500 [===========>..................] - ETA: 1:36 - loss: 0.2445 - regression_loss: 0.2335 - classification_loss: 0.0109 212/500 [===========>..................] - ETA: 1:36 - loss: 0.2454 - regression_loss: 0.2344 - classification_loss: 0.0110 213/500 [===========>..................] - ETA: 1:35 - loss: 0.2456 - regression_loss: 0.2346 - classification_loss: 0.0110 214/500 [===========>..................] - ETA: 1:35 - loss: 0.2459 - regression_loss: 0.2349 - classification_loss: 0.0110 215/500 [===========>..................] - ETA: 1:35 - loss: 0.2466 - regression_loss: 0.2356 - classification_loss: 0.0110 216/500 [===========>..................] - ETA: 1:34 - loss: 0.2465 - regression_loss: 0.2355 - classification_loss: 0.0110 217/500 [============>.................] - ETA: 1:34 - loss: 0.2466 - regression_loss: 0.2356 - classification_loss: 0.0110 218/500 [============>.................] - ETA: 1:34 - loss: 0.2463 - regression_loss: 0.2354 - classification_loss: 0.0110 219/500 [============>.................] - ETA: 1:33 - loss: 0.2472 - regression_loss: 0.2363 - classification_loss: 0.0110 220/500 [============>.................] - ETA: 1:33 - loss: 0.2477 - regression_loss: 0.2367 - classification_loss: 0.0110 221/500 [============>.................] - ETA: 1:32 - loss: 0.2473 - regression_loss: 0.2364 - classification_loss: 0.0109 222/500 [============>.................] 
- ETA: 1:32 - loss: 0.2473 - regression_loss: 0.2363 - classification_loss: 0.0109 223/500 [============>.................] - ETA: 1:32 - loss: 0.2466 - regression_loss: 0.2357 - classification_loss: 0.0109 224/500 [============>.................] - ETA: 1:31 - loss: 0.2462 - regression_loss: 0.2353 - classification_loss: 0.0109 225/500 [============>.................] - ETA: 1:31 - loss: 0.2459 - regression_loss: 0.2350 - classification_loss: 0.0108 226/500 [============>.................] - ETA: 1:31 - loss: 0.2453 - regression_loss: 0.2345 - classification_loss: 0.0108 227/500 [============>.................] - ETA: 1:30 - loss: 0.2453 - regression_loss: 0.2345 - classification_loss: 0.0108 228/500 [============>.................] - ETA: 1:30 - loss: 0.2454 - regression_loss: 0.2347 - classification_loss: 0.0108 229/500 [============>.................] - ETA: 1:30 - loss: 0.2453 - regression_loss: 0.2346 - classification_loss: 0.0108 230/500 [============>.................] - ETA: 1:29 - loss: 0.2457 - regression_loss: 0.2349 - classification_loss: 0.0108 231/500 [============>.................] - ETA: 1:29 - loss: 0.2457 - regression_loss: 0.2350 - classification_loss: 0.0107 232/500 [============>.................] - ETA: 1:29 - loss: 0.2461 - regression_loss: 0.2353 - classification_loss: 0.0107 233/500 [============>.................] - ETA: 1:29 - loss: 0.2457 - regression_loss: 0.2350 - classification_loss: 0.0107 234/500 [=============>................] - ETA: 1:28 - loss: 0.2453 - regression_loss: 0.2346 - classification_loss: 0.0107 235/500 [=============>................] - ETA: 1:28 - loss: 0.2453 - regression_loss: 0.2346 - classification_loss: 0.0107 236/500 [=============>................] - ETA: 1:27 - loss: 0.2452 - regression_loss: 0.2346 - classification_loss: 0.0107 237/500 [=============>................] - ETA: 1:27 - loss: 0.2451 - regression_loss: 0.2344 - classification_loss: 0.0106 238/500 [=============>................] 
- ETA: 1:27 - loss: 0.2452 - regression_loss: 0.2345 - classification_loss: 0.0107 239/500 [=============>................] - ETA: 1:26 - loss: 0.2452 - regression_loss: 0.2345 - classification_loss: 0.0107 240/500 [=============>................] - ETA: 1:26 - loss: 0.2446 - regression_loss: 0.2339 - classification_loss: 0.0107 241/500 [=============>................] - ETA: 1:26 - loss: 0.2443 - regression_loss: 0.2337 - classification_loss: 0.0107 242/500 [=============>................] - ETA: 1:25 - loss: 0.2446 - regression_loss: 0.2339 - classification_loss: 0.0107 243/500 [=============>................] - ETA: 1:25 - loss: 0.2444 - regression_loss: 0.2337 - classification_loss: 0.0107 244/500 [=============>................] - ETA: 1:25 - loss: 0.2440 - regression_loss: 0.2333 - classification_loss: 0.0107 245/500 [=============>................] - ETA: 1:24 - loss: 0.2450 - regression_loss: 0.2343 - classification_loss: 0.0107 246/500 [=============>................] - ETA: 1:24 - loss: 0.2453 - regression_loss: 0.2346 - classification_loss: 0.0107 247/500 [=============>................] - ETA: 1:24 - loss: 0.2454 - regression_loss: 0.2346 - classification_loss: 0.0107 248/500 [=============>................] - ETA: 1:23 - loss: 0.2456 - regression_loss: 0.2349 - classification_loss: 0.0108 249/500 [=============>................] - ETA: 1:23 - loss: 0.2453 - regression_loss: 0.2346 - classification_loss: 0.0107 250/500 [==============>...............] - ETA: 1:23 - loss: 0.2449 - regression_loss: 0.2342 - classification_loss: 0.0107 251/500 [==============>...............] - ETA: 1:23 - loss: 0.2441 - regression_loss: 0.2335 - classification_loss: 0.0107 252/500 [==============>...............] - ETA: 1:22 - loss: 0.2443 - regression_loss: 0.2336 - classification_loss: 0.0107 253/500 [==============>...............] - ETA: 1:22 - loss: 0.2439 - regression_loss: 0.2332 - classification_loss: 0.0107 254/500 [==============>...............] 
- ETA: 1:22 - loss: 0.2442 - regression_loss: 0.2334 - classification_loss: 0.0108 255/500 [==============>...............] - ETA: 1:21 - loss: 0.2443 - regression_loss: 0.2335 - classification_loss: 0.0108 256/500 [==============>...............] - ETA: 1:21 - loss: 0.2441 - regression_loss: 0.2333 - classification_loss: 0.0108 257/500 [==============>...............] - ETA: 1:20 - loss: 0.2446 - regression_loss: 0.2337 - classification_loss: 0.0109 258/500 [==============>...............] - ETA: 1:20 - loss: 0.2448 - regression_loss: 0.2339 - classification_loss: 0.0109 259/500 [==============>...............] - ETA: 1:20 - loss: 0.2451 - regression_loss: 0.2342 - classification_loss: 0.0109 260/500 [==============>...............] - ETA: 1:20 - loss: 0.2450 - regression_loss: 0.2341 - classification_loss: 0.0109 261/500 [==============>...............] - ETA: 1:19 - loss: 0.2450 - regression_loss: 0.2341 - classification_loss: 0.0109 262/500 [==============>...............] - ETA: 1:19 - loss: 0.2450 - regression_loss: 0.2341 - classification_loss: 0.0109 263/500 [==============>...............] - ETA: 1:18 - loss: 0.2444 - regression_loss: 0.2335 - classification_loss: 0.0109 264/500 [==============>...............] - ETA: 1:18 - loss: 0.2440 - regression_loss: 0.2332 - classification_loss: 0.0109 265/500 [==============>...............] - ETA: 1:18 - loss: 0.2446 - regression_loss: 0.2336 - classification_loss: 0.0109 266/500 [==============>...............] - ETA: 1:17 - loss: 0.2439 - regression_loss: 0.2330 - classification_loss: 0.0109 267/500 [===============>..............] - ETA: 1:17 - loss: 0.2440 - regression_loss: 0.2331 - classification_loss: 0.0109 268/500 [===============>..............] - ETA: 1:17 - loss: 0.2439 - regression_loss: 0.2330 - classification_loss: 0.0109 269/500 [===============>..............] - ETA: 1:16 - loss: 0.2433 - regression_loss: 0.2324 - classification_loss: 0.0109 270/500 [===============>..............] 
- ETA: 1:16 - loss: 0.2429 - regression_loss: 0.2320 - classification_loss: 0.0108 271/500 [===============>..............] - ETA: 1:16 - loss: 0.2422 - regression_loss: 0.2314 - classification_loss: 0.0108 272/500 [===============>..............] - ETA: 1:15 - loss: 0.2420 - regression_loss: 0.2312 - classification_loss: 0.0108 273/500 [===============>..............] - ETA: 1:15 - loss: 0.2414 - regression_loss: 0.2307 - classification_loss: 0.0107 274/500 [===============>..............] - ETA: 1:15 - loss: 0.2412 - regression_loss: 0.2305 - classification_loss: 0.0107 275/500 [===============>..............] - ETA: 1:14 - loss: 0.2407 - regression_loss: 0.2300 - classification_loss: 0.0107 276/500 [===============>..............] - ETA: 1:14 - loss: 0.2406 - regression_loss: 0.2299 - classification_loss: 0.0107 277/500 [===============>..............] - ETA: 1:14 - loss: 0.2401 - regression_loss: 0.2295 - classification_loss: 0.0106 278/500 [===============>..............] - ETA: 1:14 - loss: 0.2397 - regression_loss: 0.2291 - classification_loss: 0.0106 279/500 [===============>..............] - ETA: 1:13 - loss: 0.2392 - regression_loss: 0.2286 - classification_loss: 0.0106 280/500 [===============>..............] - ETA: 1:13 - loss: 0.2393 - regression_loss: 0.2286 - classification_loss: 0.0107 281/500 [===============>..............] - ETA: 1:13 - loss: 0.2392 - regression_loss: 0.2286 - classification_loss: 0.0106 282/500 [===============>..............] - ETA: 1:12 - loss: 0.2390 - regression_loss: 0.2284 - classification_loss: 0.0106 283/500 [===============>..............] - ETA: 1:12 - loss: 0.2388 - regression_loss: 0.2282 - classification_loss: 0.0106 284/500 [================>.............] - ETA: 1:12 - loss: 0.2385 - regression_loss: 0.2279 - classification_loss: 0.0106 285/500 [================>.............] - ETA: 1:11 - loss: 0.2383 - regression_loss: 0.2278 - classification_loss: 0.0106 286/500 [================>.............] 
- ETA: 1:11 - loss: 0.2387 - regression_loss: 0.2281 - classification_loss: 0.0106 287/500 [================>.............] - ETA: 1:11 - loss: 0.2384 - regression_loss: 0.2279 - classification_loss: 0.0105 288/500 [================>.............] - ETA: 1:10 - loss: 0.2387 - regression_loss: 0.2281 - classification_loss: 0.0105 289/500 [================>.............] - ETA: 1:10 - loss: 0.2386 - regression_loss: 0.2281 - classification_loss: 0.0105 290/500 [================>.............] - ETA: 1:10 - loss: 0.2382 - regression_loss: 0.2277 - classification_loss: 0.0105 291/500 [================>.............] - ETA: 1:09 - loss: 0.2382 - regression_loss: 0.2277 - classification_loss: 0.0105 292/500 [================>.............] - ETA: 1:09 - loss: 0.2379 - regression_loss: 0.2274 - classification_loss: 0.0105 293/500 [================>.............] - ETA: 1:09 - loss: 0.2378 - regression_loss: 0.2273 - classification_loss: 0.0105 294/500 [================>.............] - ETA: 1:08 - loss: 0.2383 - regression_loss: 0.2278 - classification_loss: 0.0105 295/500 [================>.............] - ETA: 1:08 - loss: 0.2385 - regression_loss: 0.2280 - classification_loss: 0.0105 296/500 [================>.............] - ETA: 1:08 - loss: 0.2385 - regression_loss: 0.2280 - classification_loss: 0.0105 297/500 [================>.............] - ETA: 1:07 - loss: 0.2389 - regression_loss: 0.2283 - classification_loss: 0.0106 298/500 [================>.............] - ETA: 1:07 - loss: 0.2392 - regression_loss: 0.2287 - classification_loss: 0.0106 299/500 [================>.............] - ETA: 1:07 - loss: 0.2394 - regression_loss: 0.2288 - classification_loss: 0.0106 300/500 [=================>............] - ETA: 1:06 - loss: 0.2391 - regression_loss: 0.2285 - classification_loss: 0.0106 301/500 [=================>............] - ETA: 1:06 - loss: 0.2386 - regression_loss: 0.2281 - classification_loss: 0.0106 302/500 [=================>............] 
- ETA: 1:06 - loss: 0.2386 - regression_loss: 0.2281 - classification_loss: 0.0105 303/500 [=================>............] - ETA: 1:05 - loss: 0.2395 - regression_loss: 0.2289 - classification_loss: 0.0106 304/500 [=================>............] - ETA: 1:05 - loss: 0.2399 - regression_loss: 0.2292 - classification_loss: 0.0106 305/500 [=================>............] - ETA: 1:05 - loss: 0.2396 - regression_loss: 0.2290 - classification_loss: 0.0106 306/500 [=================>............] - ETA: 1:04 - loss: 0.2396 - regression_loss: 0.2290 - classification_loss: 0.0106 307/500 [=================>............] - ETA: 1:04 - loss: 0.2396 - regression_loss: 0.2289 - classification_loss: 0.0106 308/500 [=================>............] - ETA: 1:04 - loss: 0.2403 - regression_loss: 0.2297 - classification_loss: 0.0106 309/500 [=================>............] - ETA: 1:03 - loss: 0.2400 - regression_loss: 0.2294 - classification_loss: 0.0106 310/500 [=================>............] - ETA: 1:03 - loss: 0.2400 - regression_loss: 0.2294 - classification_loss: 0.0106 311/500 [=================>............] - ETA: 1:03 - loss: 0.2403 - regression_loss: 0.2297 - classification_loss: 0.0106 312/500 [=================>............] - ETA: 1:02 - loss: 0.2401 - regression_loss: 0.2295 - classification_loss: 0.0106 313/500 [=================>............] - ETA: 1:02 - loss: 0.2401 - regression_loss: 0.2295 - classification_loss: 0.0107 314/500 [=================>............] - ETA: 1:02 - loss: 0.2401 - regression_loss: 0.2295 - classification_loss: 0.0107 315/500 [=================>............] - ETA: 1:01 - loss: 0.2402 - regression_loss: 0.2295 - classification_loss: 0.0107 316/500 [=================>............] - ETA: 1:01 - loss: 0.2397 - regression_loss: 0.2290 - classification_loss: 0.0106 317/500 [==================>...........] - ETA: 1:01 - loss: 0.2395 - regression_loss: 0.2289 - classification_loss: 0.0106 318/500 [==================>...........] 
- ETA: 1:00 - loss: 0.2390 - regression_loss: 0.2284 - classification_loss: 0.0106 319/500 [==================>...........] - ETA: 1:00 - loss: 0.2387 - regression_loss: 0.2281 - classification_loss: 0.0106 320/500 [==================>...........] - ETA: 1:00 - loss: 0.2388 - regression_loss: 0.2282 - classification_loss: 0.0106 321/500 [==================>...........] - ETA: 59s - loss: 0.2392 - regression_loss: 0.2286 - classification_loss: 0.0106  322/500 [==================>...........] - ETA: 59s - loss: 0.2392 - regression_loss: 0.2286 - classification_loss: 0.0106 323/500 [==================>...........] - ETA: 59s - loss: 0.2390 - regression_loss: 0.2283 - classification_loss: 0.0106 324/500 [==================>...........] - ETA: 58s - loss: 0.2386 - regression_loss: 0.2280 - classification_loss: 0.0106 325/500 [==================>...........] - ETA: 58s - loss: 0.2387 - regression_loss: 0.2281 - classification_loss: 0.0106 326/500 [==================>...........] - ETA: 58s - loss: 0.2390 - regression_loss: 0.2284 - classification_loss: 0.0106 327/500 [==================>...........] - ETA: 57s - loss: 0.2387 - regression_loss: 0.2281 - classification_loss: 0.0106 328/500 [==================>...........] - ETA: 57s - loss: 0.2398 - regression_loss: 0.2292 - classification_loss: 0.0106 329/500 [==================>...........] - ETA: 57s - loss: 0.2399 - regression_loss: 0.2293 - classification_loss: 0.0106 330/500 [==================>...........] - ETA: 56s - loss: 0.2398 - regression_loss: 0.2292 - classification_loss: 0.0105 331/500 [==================>...........] - ETA: 56s - loss: 0.2399 - regression_loss: 0.2293 - classification_loss: 0.0106 332/500 [==================>...........] - ETA: 56s - loss: 0.2397 - regression_loss: 0.2292 - classification_loss: 0.0105 333/500 [==================>...........] - ETA: 55s - loss: 0.2396 - regression_loss: 0.2291 - classification_loss: 0.0105 334/500 [===================>..........] 
- ETA: 55s - loss: 0.2398 - regression_loss: 0.2293 - classification_loss: 0.0106 335/500 [===================>..........] - ETA: 55s - loss: 0.2398 - regression_loss: 0.2292 - classification_loss: 0.0105 336/500 [===================>..........] - ETA: 54s - loss: 0.2403 - regression_loss: 0.2296 - classification_loss: 0.0107 337/500 [===================>..........] - ETA: 54s - loss: 0.2406 - regression_loss: 0.2299 - classification_loss: 0.0107 338/500 [===================>..........] - ETA: 54s - loss: 0.2409 - regression_loss: 0.2301 - classification_loss: 0.0108 339/500 [===================>..........] - ETA: 53s - loss: 0.2409 - regression_loss: 0.2301 - classification_loss: 0.0108 340/500 [===================>..........] - ETA: 53s - loss: 0.2408 - regression_loss: 0.2300 - classification_loss: 0.0108 341/500 [===================>..........] - ETA: 53s - loss: 0.2407 - regression_loss: 0.2300 - classification_loss: 0.0108 342/500 [===================>..........] - ETA: 52s - loss: 0.2409 - regression_loss: 0.2301 - classification_loss: 0.0108 343/500 [===================>..........] - ETA: 52s - loss: 0.2408 - regression_loss: 0.2300 - classification_loss: 0.0108 344/500 [===================>..........] - ETA: 52s - loss: 0.2408 - regression_loss: 0.2300 - classification_loss: 0.0108 345/500 [===================>..........] - ETA: 51s - loss: 0.2405 - regression_loss: 0.2297 - classification_loss: 0.0108 346/500 [===================>..........] - ETA: 51s - loss: 0.2407 - regression_loss: 0.2299 - classification_loss: 0.0108 347/500 [===================>..........] - ETA: 51s - loss: 0.2404 - regression_loss: 0.2296 - classification_loss: 0.0108 348/500 [===================>..........] - ETA: 50s - loss: 0.2406 - regression_loss: 0.2299 - classification_loss: 0.0108 349/500 [===================>..........] - ETA: 50s - loss: 0.2405 - regression_loss: 0.2297 - classification_loss: 0.0108 350/500 [====================>.........] 
[per-step progress output for epoch 49 omitted; final summary below]
500/500 [==============================] - 165s 330ms/step - loss: 0.2415 - regression_loss: 0.2310 - classification_loss: 0.0105
1172 instances of class plum with average precision: 0.7440
mAP: 0.7440
Epoch 00049: saving model to ./training/snapshots/resnet101_pascal_49.h5
Epoch 50/150
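When working with logs like the above, it is often useful to pull the metric values out of the epoch-end summary line programmatically. Below is a minimal sketch (not part of the original training code) that parses a `name: value` metric line of the format seen in this log; the sample line and the `parse_metrics` helper are illustrative assumptions, not part of keras-retinanet itself.

```python
import re

# Sample epoch-end summary line, copied from the log format above.
LINE = ("500/500 [==============================] - 165s 330ms/step - "
        "loss: 0.2415 - regression_loss: 0.2310 - classification_loss: 0.0105")

def parse_metrics(line):
    """Extract '<name>loss: <value>' metric pairs from a Keras progress line.

    Returns a dict mapping metric names (e.g. 'regression_loss') to floats.
    """
    return {name: float(val)
            for name, val in re.findall(r"(\w*loss): ([0-9.]+)", line)}

print(parse_metrics(LINE))
```

This approach relies only on the `name: value` convention of Keras progress output, so it works on both mid-epoch and end-of-epoch lines; timing fields like `165s 330ms/step` are ignored because they do not match the `loss` suffix.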
[per-step progress output for epoch 50 omitted]
- ETA: 1:37 - loss: 0.2509 - regression_loss: 0.2394 - classification_loss: 0.0115 186/500 [==========>...................] - ETA: 1:36 - loss: 0.2515 - regression_loss: 0.2399 - classification_loss: 0.0116 187/500 [==========>...................] - ETA: 1:36 - loss: 0.2518 - regression_loss: 0.2402 - classification_loss: 0.0115 188/500 [==========>...................] - ETA: 1:36 - loss: 0.2513 - regression_loss: 0.2398 - classification_loss: 0.0115 189/500 [==========>...................] - ETA: 1:35 - loss: 0.2510 - regression_loss: 0.2394 - classification_loss: 0.0116 190/500 [==========>...................] - ETA: 1:35 - loss: 0.2511 - regression_loss: 0.2396 - classification_loss: 0.0116 191/500 [==========>...................] - ETA: 1:35 - loss: 0.2516 - regression_loss: 0.2400 - classification_loss: 0.0116 192/500 [==========>...................] - ETA: 1:34 - loss: 0.2516 - regression_loss: 0.2400 - classification_loss: 0.0116 193/500 [==========>...................] - ETA: 1:34 - loss: 0.2513 - regression_loss: 0.2398 - classification_loss: 0.0116 194/500 [==========>...................] - ETA: 1:34 - loss: 0.2517 - regression_loss: 0.2401 - classification_loss: 0.0116 195/500 [==========>...................] - ETA: 1:34 - loss: 0.2514 - regression_loss: 0.2399 - classification_loss: 0.0115 196/500 [==========>...................] - ETA: 1:33 - loss: 0.2512 - regression_loss: 0.2397 - classification_loss: 0.0115 197/500 [==========>...................] - ETA: 1:33 - loss: 0.2503 - regression_loss: 0.2388 - classification_loss: 0.0115 198/500 [==========>...................] - ETA: 1:33 - loss: 0.2496 - regression_loss: 0.2381 - classification_loss: 0.0114 199/500 [==========>...................] - ETA: 1:32 - loss: 0.2495 - regression_loss: 0.2381 - classification_loss: 0.0114 200/500 [===========>..................] - ETA: 1:32 - loss: 0.2495 - regression_loss: 0.2381 - classification_loss: 0.0114 201/500 [===========>..................] 
- ETA: 1:32 - loss: 0.2490 - regression_loss: 0.2376 - classification_loss: 0.0114 202/500 [===========>..................] - ETA: 1:31 - loss: 0.2488 - regression_loss: 0.2375 - classification_loss: 0.0114 203/500 [===========>..................] - ETA: 1:31 - loss: 0.2485 - regression_loss: 0.2371 - classification_loss: 0.0113 204/500 [===========>..................] - ETA: 1:31 - loss: 0.2484 - regression_loss: 0.2371 - classification_loss: 0.0113 205/500 [===========>..................] - ETA: 1:31 - loss: 0.2484 - regression_loss: 0.2371 - classification_loss: 0.0113 206/500 [===========>..................] - ETA: 1:30 - loss: 0.2479 - regression_loss: 0.2366 - classification_loss: 0.0112 207/500 [===========>..................] - ETA: 1:30 - loss: 0.2473 - regression_loss: 0.2361 - classification_loss: 0.0112 208/500 [===========>..................] - ETA: 1:30 - loss: 0.2477 - regression_loss: 0.2364 - classification_loss: 0.0113 209/500 [===========>..................] - ETA: 1:29 - loss: 0.2479 - regression_loss: 0.2365 - classification_loss: 0.0113 210/500 [===========>..................] - ETA: 1:29 - loss: 0.2471 - regression_loss: 0.2358 - classification_loss: 0.0113 211/500 [===========>..................] - ETA: 1:29 - loss: 0.2475 - regression_loss: 0.2362 - classification_loss: 0.0113 212/500 [===========>..................] - ETA: 1:28 - loss: 0.2480 - regression_loss: 0.2367 - classification_loss: 0.0114 213/500 [===========>..................] - ETA: 1:28 - loss: 0.2481 - regression_loss: 0.2367 - classification_loss: 0.0114 214/500 [===========>..................] - ETA: 1:28 - loss: 0.2490 - regression_loss: 0.2377 - classification_loss: 0.0114 215/500 [===========>..................] - ETA: 1:28 - loss: 0.2493 - regression_loss: 0.2379 - classification_loss: 0.0114 216/500 [===========>..................] - ETA: 1:27 - loss: 0.2487 - regression_loss: 0.2374 - classification_loss: 0.0113 217/500 [============>.................] 
- ETA: 1:27 - loss: 0.2493 - regression_loss: 0.2379 - classification_loss: 0.0114 218/500 [============>.................] - ETA: 1:27 - loss: 0.2486 - regression_loss: 0.2372 - classification_loss: 0.0114 219/500 [============>.................] - ETA: 1:26 - loss: 0.2482 - regression_loss: 0.2369 - classification_loss: 0.0113 220/500 [============>.................] - ETA: 1:26 - loss: 0.2483 - regression_loss: 0.2369 - classification_loss: 0.0113 221/500 [============>.................] - ETA: 1:26 - loss: 0.2479 - regression_loss: 0.2366 - classification_loss: 0.0113 222/500 [============>.................] - ETA: 1:25 - loss: 0.2479 - regression_loss: 0.2366 - classification_loss: 0.0113 223/500 [============>.................] - ETA: 1:25 - loss: 0.2478 - regression_loss: 0.2365 - classification_loss: 0.0113 224/500 [============>.................] - ETA: 1:25 - loss: 0.2481 - regression_loss: 0.2367 - classification_loss: 0.0114 225/500 [============>.................] - ETA: 1:25 - loss: 0.2483 - regression_loss: 0.2369 - classification_loss: 0.0113 226/500 [============>.................] - ETA: 1:24 - loss: 0.2482 - regression_loss: 0.2368 - classification_loss: 0.0113 227/500 [============>.................] - ETA: 1:24 - loss: 0.2483 - regression_loss: 0.2370 - classification_loss: 0.0113 228/500 [============>.................] - ETA: 1:24 - loss: 0.2491 - regression_loss: 0.2378 - classification_loss: 0.0113 229/500 [============>.................] - ETA: 1:23 - loss: 0.2488 - regression_loss: 0.2375 - classification_loss: 0.0113 230/500 [============>.................] - ETA: 1:23 - loss: 0.2492 - regression_loss: 0.2379 - classification_loss: 0.0113 231/500 [============>.................] - ETA: 1:23 - loss: 0.2492 - regression_loss: 0.2379 - classification_loss: 0.0113 232/500 [============>.................] - ETA: 1:23 - loss: 0.2493 - regression_loss: 0.2380 - classification_loss: 0.0112 233/500 [============>.................] 
- ETA: 1:22 - loss: 0.2489 - regression_loss: 0.2377 - classification_loss: 0.0112 234/500 [=============>................] - ETA: 1:22 - loss: 0.2483 - regression_loss: 0.2372 - classification_loss: 0.0112 235/500 [=============>................] - ETA: 1:22 - loss: 0.2487 - regression_loss: 0.2375 - classification_loss: 0.0112 236/500 [=============>................] - ETA: 1:21 - loss: 0.2479 - regression_loss: 0.2368 - classification_loss: 0.0111 237/500 [=============>................] - ETA: 1:21 - loss: 0.2487 - regression_loss: 0.2375 - classification_loss: 0.0112 238/500 [=============>................] - ETA: 1:21 - loss: 0.2487 - regression_loss: 0.2375 - classification_loss: 0.0112 239/500 [=============>................] - ETA: 1:21 - loss: 0.2483 - regression_loss: 0.2371 - classification_loss: 0.0112 240/500 [=============>................] - ETA: 1:20 - loss: 0.2484 - regression_loss: 0.2372 - classification_loss: 0.0113 241/500 [=============>................] - ETA: 1:20 - loss: 0.2483 - regression_loss: 0.2370 - classification_loss: 0.0112 242/500 [=============>................] - ETA: 1:20 - loss: 0.2479 - regression_loss: 0.2367 - classification_loss: 0.0112 243/500 [=============>................] - ETA: 1:19 - loss: 0.2486 - regression_loss: 0.2373 - classification_loss: 0.0112 244/500 [=============>................] - ETA: 1:19 - loss: 0.2481 - regression_loss: 0.2369 - classification_loss: 0.0112 245/500 [=============>................] - ETA: 1:19 - loss: 0.2478 - regression_loss: 0.2366 - classification_loss: 0.0112 246/500 [=============>................] - ETA: 1:19 - loss: 0.2481 - regression_loss: 0.2369 - classification_loss: 0.0112 247/500 [=============>................] - ETA: 1:18 - loss: 0.2478 - regression_loss: 0.2365 - classification_loss: 0.0112 248/500 [=============>................] - ETA: 1:18 - loss: 0.2474 - regression_loss: 0.2362 - classification_loss: 0.0112 249/500 [=============>................] 
- ETA: 1:18 - loss: 0.2471 - regression_loss: 0.2359 - classification_loss: 0.0112 250/500 [==============>...............] - ETA: 1:17 - loss: 0.2480 - regression_loss: 0.2369 - classification_loss: 0.0112 251/500 [==============>...............] - ETA: 1:17 - loss: 0.2481 - regression_loss: 0.2369 - classification_loss: 0.0112 252/500 [==============>...............] - ETA: 1:17 - loss: 0.2483 - regression_loss: 0.2371 - classification_loss: 0.0112 253/500 [==============>...............] - ETA: 1:16 - loss: 0.2486 - regression_loss: 0.2374 - classification_loss: 0.0112 254/500 [==============>...............] - ETA: 1:16 - loss: 0.2486 - regression_loss: 0.2374 - classification_loss: 0.0111 255/500 [==============>...............] - ETA: 1:16 - loss: 0.2482 - regression_loss: 0.2371 - classification_loss: 0.0111 256/500 [==============>...............] - ETA: 1:16 - loss: 0.2480 - regression_loss: 0.2369 - classification_loss: 0.0111 257/500 [==============>...............] - ETA: 1:15 - loss: 0.2482 - regression_loss: 0.2371 - classification_loss: 0.0111 258/500 [==============>...............] - ETA: 1:15 - loss: 0.2484 - regression_loss: 0.2373 - classification_loss: 0.0111 259/500 [==============>...............] - ETA: 1:15 - loss: 0.2482 - regression_loss: 0.2371 - classification_loss: 0.0111 260/500 [==============>...............] - ETA: 1:14 - loss: 0.2478 - regression_loss: 0.2367 - classification_loss: 0.0110 261/500 [==============>...............] - ETA: 1:14 - loss: 0.2474 - regression_loss: 0.2364 - classification_loss: 0.0110 262/500 [==============>...............] - ETA: 1:14 - loss: 0.2477 - regression_loss: 0.2366 - classification_loss: 0.0110 263/500 [==============>...............] - ETA: 1:14 - loss: 0.2475 - regression_loss: 0.2365 - classification_loss: 0.0110 264/500 [==============>...............] - ETA: 1:13 - loss: 0.2476 - regression_loss: 0.2366 - classification_loss: 0.0110 265/500 [==============>...............] 
- ETA: 1:13 - loss: 0.2477 - regression_loss: 0.2367 - classification_loss: 0.0110 266/500 [==============>...............] - ETA: 1:13 - loss: 0.2474 - regression_loss: 0.2364 - classification_loss: 0.0110 267/500 [===============>..............] - ETA: 1:12 - loss: 0.2470 - regression_loss: 0.2361 - classification_loss: 0.0110 268/500 [===============>..............] - ETA: 1:12 - loss: 0.2472 - regression_loss: 0.2362 - classification_loss: 0.0110 269/500 [===============>..............] - ETA: 1:12 - loss: 0.2473 - regression_loss: 0.2363 - classification_loss: 0.0110 270/500 [===============>..............] - ETA: 1:11 - loss: 0.2469 - regression_loss: 0.2359 - classification_loss: 0.0109 271/500 [===============>..............] - ETA: 1:11 - loss: 0.2466 - regression_loss: 0.2357 - classification_loss: 0.0109 272/500 [===============>..............] - ETA: 1:11 - loss: 0.2461 - regression_loss: 0.2352 - classification_loss: 0.0109 273/500 [===============>..............] - ETA: 1:11 - loss: 0.2456 - regression_loss: 0.2347 - classification_loss: 0.0109 274/500 [===============>..............] - ETA: 1:10 - loss: 0.2454 - regression_loss: 0.2346 - classification_loss: 0.0109 275/500 [===============>..............] - ETA: 1:10 - loss: 0.2453 - regression_loss: 0.2345 - classification_loss: 0.0109 276/500 [===============>..............] - ETA: 1:10 - loss: 0.2453 - regression_loss: 0.2344 - classification_loss: 0.0109 277/500 [===============>..............] - ETA: 1:09 - loss: 0.2450 - regression_loss: 0.2342 - classification_loss: 0.0108 278/500 [===============>..............] - ETA: 1:09 - loss: 0.2449 - regression_loss: 0.2341 - classification_loss: 0.0108 279/500 [===============>..............] - ETA: 1:09 - loss: 0.2448 - regression_loss: 0.2340 - classification_loss: 0.0108 280/500 [===============>..............] - ETA: 1:09 - loss: 0.2445 - regression_loss: 0.2337 - classification_loss: 0.0108 281/500 [===============>..............] 
- ETA: 1:08 - loss: 0.2446 - regression_loss: 0.2338 - classification_loss: 0.0108 282/500 [===============>..............] - ETA: 1:08 - loss: 0.2444 - regression_loss: 0.2336 - classification_loss: 0.0108 283/500 [===============>..............] - ETA: 1:08 - loss: 0.2441 - regression_loss: 0.2334 - classification_loss: 0.0107 284/500 [================>.............] - ETA: 1:07 - loss: 0.2442 - regression_loss: 0.2335 - classification_loss: 0.0108 285/500 [================>.............] - ETA: 1:07 - loss: 0.2442 - regression_loss: 0.2334 - classification_loss: 0.0107 286/500 [================>.............] - ETA: 1:07 - loss: 0.2437 - regression_loss: 0.2330 - classification_loss: 0.0107 287/500 [================>.............] - ETA: 1:06 - loss: 0.2442 - regression_loss: 0.2335 - classification_loss: 0.0108 288/500 [================>.............] - ETA: 1:06 - loss: 0.2441 - regression_loss: 0.2333 - classification_loss: 0.0107 289/500 [================>.............] - ETA: 1:06 - loss: 0.2434 - regression_loss: 0.2327 - classification_loss: 0.0107 290/500 [================>.............] - ETA: 1:06 - loss: 0.2431 - regression_loss: 0.2324 - classification_loss: 0.0107 291/500 [================>.............] - ETA: 1:05 - loss: 0.2432 - regression_loss: 0.2325 - classification_loss: 0.0107 292/500 [================>.............] - ETA: 1:05 - loss: 0.2433 - regression_loss: 0.2326 - classification_loss: 0.0107 293/500 [================>.............] - ETA: 1:05 - loss: 0.2441 - regression_loss: 0.2334 - classification_loss: 0.0107 294/500 [================>.............] - ETA: 1:04 - loss: 0.2441 - regression_loss: 0.2335 - classification_loss: 0.0106 295/500 [================>.............] - ETA: 1:04 - loss: 0.2449 - regression_loss: 0.2342 - classification_loss: 0.0107 296/500 [================>.............] - ETA: 1:04 - loss: 0.2449 - regression_loss: 0.2343 - classification_loss: 0.0106 297/500 [================>.............] 
- ETA: 1:03 - loss: 0.2451 - regression_loss: 0.2345 - classification_loss: 0.0106 298/500 [================>.............] - ETA: 1:03 - loss: 0.2459 - regression_loss: 0.2350 - classification_loss: 0.0108 299/500 [================>.............] - ETA: 1:03 - loss: 0.2456 - regression_loss: 0.2348 - classification_loss: 0.0108 300/500 [=================>............] - ETA: 1:03 - loss: 0.2458 - regression_loss: 0.2350 - classification_loss: 0.0108 301/500 [=================>............] - ETA: 1:02 - loss: 0.2458 - regression_loss: 0.2350 - classification_loss: 0.0108 302/500 [=================>............] - ETA: 1:02 - loss: 0.2453 - regression_loss: 0.2346 - classification_loss: 0.0107 303/500 [=================>............] - ETA: 1:02 - loss: 0.2451 - regression_loss: 0.2343 - classification_loss: 0.0107 304/500 [=================>............] - ETA: 1:01 - loss: 0.2451 - regression_loss: 0.2344 - classification_loss: 0.0107 305/500 [=================>............] - ETA: 1:01 - loss: 0.2452 - regression_loss: 0.2344 - classification_loss: 0.0107 306/500 [=================>............] - ETA: 1:01 - loss: 0.2449 - regression_loss: 0.2342 - classification_loss: 0.0107 307/500 [=================>............] - ETA: 1:00 - loss: 0.2451 - regression_loss: 0.2344 - classification_loss: 0.0107 308/500 [=================>............] - ETA: 1:00 - loss: 0.2453 - regression_loss: 0.2345 - classification_loss: 0.0107 309/500 [=================>............] - ETA: 1:00 - loss: 0.2452 - regression_loss: 0.2345 - classification_loss: 0.0107 310/500 [=================>............] - ETA: 1:00 - loss: 0.2458 - regression_loss: 0.2351 - classification_loss: 0.0107 311/500 [=================>............] - ETA: 59s - loss: 0.2455 - regression_loss: 0.2348 - classification_loss: 0.0107  312/500 [=================>............] - ETA: 59s - loss: 0.2453 - regression_loss: 0.2346 - classification_loss: 0.0107 313/500 [=================>............] 
- ETA: 59s - loss: 0.2450 - regression_loss: 0.2343 - classification_loss: 0.0107 314/500 [=================>............] - ETA: 58s - loss: 0.2458 - regression_loss: 0.2351 - classification_loss: 0.0107 315/500 [=================>............] - ETA: 58s - loss: 0.2458 - regression_loss: 0.2351 - classification_loss: 0.0107 316/500 [=================>............] - ETA: 58s - loss: 0.2455 - regression_loss: 0.2348 - classification_loss: 0.0107 317/500 [==================>...........] - ETA: 57s - loss: 0.2466 - regression_loss: 0.2359 - classification_loss: 0.0107 318/500 [==================>...........] - ETA: 57s - loss: 0.2471 - regression_loss: 0.2364 - classification_loss: 0.0107 319/500 [==================>...........] - ETA: 57s - loss: 0.2469 - regression_loss: 0.2362 - classification_loss: 0.0107 320/500 [==================>...........] - ETA: 56s - loss: 0.2470 - regression_loss: 0.2362 - classification_loss: 0.0107 321/500 [==================>...........] - ETA: 56s - loss: 0.2470 - regression_loss: 0.2363 - classification_loss: 0.0107 322/500 [==================>...........] - ETA: 56s - loss: 0.2469 - regression_loss: 0.2361 - classification_loss: 0.0107 323/500 [==================>...........] - ETA: 56s - loss: 0.2471 - regression_loss: 0.2363 - classification_loss: 0.0108 324/500 [==================>...........] - ETA: 55s - loss: 0.2468 - regression_loss: 0.2360 - classification_loss: 0.0107 325/500 [==================>...........] - ETA: 55s - loss: 0.2468 - regression_loss: 0.2361 - classification_loss: 0.0108 326/500 [==================>...........] - ETA: 55s - loss: 0.2464 - regression_loss: 0.2357 - classification_loss: 0.0107 327/500 [==================>...........] - ETA: 54s - loss: 0.2463 - regression_loss: 0.2356 - classification_loss: 0.0107 328/500 [==================>...........] - ETA: 54s - loss: 0.2460 - regression_loss: 0.2353 - classification_loss: 0.0107 329/500 [==================>...........] 
- ETA: 54s - loss: 0.2463 - regression_loss: 0.2356 - classification_loss: 0.0107 330/500 [==================>...........] - ETA: 53s - loss: 0.2465 - regression_loss: 0.2357 - classification_loss: 0.0107 331/500 [==================>...........] - ETA: 53s - loss: 0.2461 - regression_loss: 0.2354 - classification_loss: 0.0107 332/500 [==================>...........] - ETA: 53s - loss: 0.2460 - regression_loss: 0.2353 - classification_loss: 0.0107 333/500 [==================>...........] - ETA: 53s - loss: 0.2461 - regression_loss: 0.2354 - classification_loss: 0.0107 334/500 [===================>..........] - ETA: 52s - loss: 0.2462 - regression_loss: 0.2355 - classification_loss: 0.0107 335/500 [===================>..........] - ETA: 52s - loss: 0.2462 - regression_loss: 0.2355 - classification_loss: 0.0107 336/500 [===================>..........] - ETA: 52s - loss: 0.2460 - regression_loss: 0.2353 - classification_loss: 0.0107 337/500 [===================>..........] - ETA: 51s - loss: 0.2457 - regression_loss: 0.2350 - classification_loss: 0.0107 338/500 [===================>..........] - ETA: 51s - loss: 0.2456 - regression_loss: 0.2349 - classification_loss: 0.0108 339/500 [===================>..........] - ETA: 51s - loss: 0.2458 - regression_loss: 0.2350 - classification_loss: 0.0108 340/500 [===================>..........] - ETA: 50s - loss: 0.2456 - regression_loss: 0.2348 - classification_loss: 0.0107 341/500 [===================>..........] - ETA: 50s - loss: 0.2453 - regression_loss: 0.2346 - classification_loss: 0.0107 342/500 [===================>..........] - ETA: 50s - loss: 0.2456 - regression_loss: 0.2349 - classification_loss: 0.0107 343/500 [===================>..........] - ETA: 49s - loss: 0.2455 - regression_loss: 0.2348 - classification_loss: 0.0107 344/500 [===================>..........] - ETA: 49s - loss: 0.2453 - regression_loss: 0.2346 - classification_loss: 0.0107 345/500 [===================>..........] 
- ETA: 49s - loss: 0.2458 - regression_loss: 0.2351 - classification_loss: 0.0107 346/500 [===================>..........] - ETA: 49s - loss: 0.2455 - regression_loss: 0.2349 - classification_loss: 0.0106 347/500 [===================>..........] - ETA: 48s - loss: 0.2452 - regression_loss: 0.2345 - classification_loss: 0.0106 348/500 [===================>..........] - ETA: 48s - loss: 0.2453 - regression_loss: 0.2347 - classification_loss: 0.0106 349/500 [===================>..........] - ETA: 48s - loss: 0.2451 - regression_loss: 0.2345 - classification_loss: 0.0106 350/500 [====================>.........] - ETA: 47s - loss: 0.2449 - regression_loss: 0.2343 - classification_loss: 0.0106 351/500 [====================>.........] - ETA: 47s - loss: 0.2452 - regression_loss: 0.2346 - classification_loss: 0.0106 352/500 [====================>.........] - ETA: 47s - loss: 0.2451 - regression_loss: 0.2345 - classification_loss: 0.0106 353/500 [====================>.........] - ETA: 46s - loss: 0.2448 - regression_loss: 0.2342 - classification_loss: 0.0106 354/500 [====================>.........] - ETA: 46s - loss: 0.2446 - regression_loss: 0.2340 - classification_loss: 0.0106 355/500 [====================>.........] - ETA: 46s - loss: 0.2442 - regression_loss: 0.2336 - classification_loss: 0.0105 356/500 [====================>.........] - ETA: 45s - loss: 0.2442 - regression_loss: 0.2337 - classification_loss: 0.0105 357/500 [====================>.........] - ETA: 45s - loss: 0.2442 - regression_loss: 0.2337 - classification_loss: 0.0105 358/500 [====================>.........] - ETA: 45s - loss: 0.2441 - regression_loss: 0.2335 - classification_loss: 0.0105 359/500 [====================>.........] - ETA: 45s - loss: 0.2441 - regression_loss: 0.2336 - classification_loss: 0.0105 360/500 [====================>.........] - ETA: 44s - loss: 0.2443 - regression_loss: 0.2338 - classification_loss: 0.0105 361/500 [====================>.........] 
- ETA: 44s - loss: 0.2441 - regression_loss: 0.2336 - classification_loss: 0.0105 362/500 [====================>.........] - ETA: 44s - loss: 0.2439 - regression_loss: 0.2334 - classification_loss: 0.0105 363/500 [====================>.........] - ETA: 43s - loss: 0.2434 - regression_loss: 0.2329 - classification_loss: 0.0104 364/500 [====================>.........] - ETA: 43s - loss: 0.2438 - regression_loss: 0.2333 - classification_loss: 0.0105 365/500 [====================>.........] - ETA: 43s - loss: 0.2437 - regression_loss: 0.2332 - classification_loss: 0.0105 366/500 [====================>.........] - ETA: 42s - loss: 0.2435 - regression_loss: 0.2331 - classification_loss: 0.0104 367/500 [=====================>........] - ETA: 42s - loss: 0.2434 - regression_loss: 0.2330 - classification_loss: 0.0104 368/500 [=====================>........] - ETA: 42s - loss: 0.2432 - regression_loss: 0.2328 - classification_loss: 0.0104 369/500 [=====================>........] - ETA: 41s - loss: 0.2433 - regression_loss: 0.2328 - classification_loss: 0.0105 370/500 [=====================>........] - ETA: 41s - loss: 0.2438 - regression_loss: 0.2333 - classification_loss: 0.0105 371/500 [=====================>........] - ETA: 41s - loss: 0.2438 - regression_loss: 0.2333 - classification_loss: 0.0105 372/500 [=====================>........] - ETA: 41s - loss: 0.2434 - regression_loss: 0.2329 - classification_loss: 0.0105 373/500 [=====================>........] - ETA: 40s - loss: 0.2430 - regression_loss: 0.2325 - classification_loss: 0.0105 374/500 [=====================>........] - ETA: 40s - loss: 0.2431 - regression_loss: 0.2326 - classification_loss: 0.0105 375/500 [=====================>........] - ETA: 40s - loss: 0.2429 - regression_loss: 0.2325 - classification_loss: 0.0105 376/500 [=====================>........] - ETA: 39s - loss: 0.2428 - regression_loss: 0.2324 - classification_loss: 0.0104 377/500 [=====================>........] 
- ETA: 39s - loss: 0.2427 - regression_loss: 0.2323 - classification_loss: 0.0104 378/500 [=====================>........] - ETA: 39s - loss: 0.2426 - regression_loss: 0.2322 - classification_loss: 0.0104 379/500 [=====================>........] - ETA: 38s - loss: 0.2429 - regression_loss: 0.2325 - classification_loss: 0.0104 380/500 [=====================>........] - ETA: 38s - loss: 0.2428 - regression_loss: 0.2324 - classification_loss: 0.0104 381/500 [=====================>........] - ETA: 38s - loss: 0.2425 - regression_loss: 0.2321 - classification_loss: 0.0104 382/500 [=====================>........] - ETA: 37s - loss: 0.2422 - regression_loss: 0.2319 - classification_loss: 0.0104 383/500 [=====================>........] - ETA: 37s - loss: 0.2424 - regression_loss: 0.2320 - classification_loss: 0.0104 384/500 [======================>.......] - ETA: 37s - loss: 0.2423 - regression_loss: 0.2319 - classification_loss: 0.0104 385/500 [======================>.......] - ETA: 36s - loss: 0.2421 - regression_loss: 0.2318 - classification_loss: 0.0104 386/500 [======================>.......] - ETA: 36s - loss: 0.2423 - regression_loss: 0.2319 - classification_loss: 0.0104 387/500 [======================>.......] - ETA: 36s - loss: 0.2420 - regression_loss: 0.2317 - classification_loss: 0.0103 388/500 [======================>.......] - ETA: 35s - loss: 0.2419 - regression_loss: 0.2315 - classification_loss: 0.0103 389/500 [======================>.......] - ETA: 35s - loss: 0.2421 - regression_loss: 0.2318 - classification_loss: 0.0103 390/500 [======================>.......] - ETA: 35s - loss: 0.2420 - regression_loss: 0.2317 - classification_loss: 0.0104 391/500 [======================>.......] - ETA: 35s - loss: 0.2420 - regression_loss: 0.2316 - classification_loss: 0.0103 392/500 [======================>.......] - ETA: 34s - loss: 0.2417 - regression_loss: 0.2314 - classification_loss: 0.0103 393/500 [======================>.......] 
[Epoch 50/150: per-batch progress output for steps 394–499 elided; running loss ≈ 0.240, regression_loss ≈ 0.230, classification_loss ≈ 0.010 throughout]
500/500 [==============================] - 163s 325ms/step - loss: 0.2390 - regression_loss: 0.2287 - classification_loss: 0.0102
1172 instances of class plum with average precision: 0.7553
mAP: 0.7553
Epoch 00050: saving model to ./training/snapshots/resnet101_pascal_50.h5
Epoch 00050: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-07.
Epoch 51/150
[per-batch progress output for steps 1–227 elided; loss falls from ≈ 0.19 to ≈ 0.15, classification_loss ≈ 0.007]
- ETA: 1:32 - loss: 0.1504 - regression_loss: 0.1431 - classification_loss: 0.0073 228/500 [============>.................] - ETA: 1:32 - loss: 0.1506 - regression_loss: 0.1434 - classification_loss: 0.0073 229/500 [============>.................] - ETA: 1:32 - loss: 0.1504 - regression_loss: 0.1431 - classification_loss: 0.0072 230/500 [============>.................] - ETA: 1:31 - loss: 0.1500 - regression_loss: 0.1428 - classification_loss: 0.0072 231/500 [============>.................] - ETA: 1:31 - loss: 0.1504 - regression_loss: 0.1431 - classification_loss: 0.0073 232/500 [============>.................] - ETA: 1:31 - loss: 0.1500 - regression_loss: 0.1427 - classification_loss: 0.0072 233/500 [============>.................] - ETA: 1:30 - loss: 0.1496 - regression_loss: 0.1424 - classification_loss: 0.0072 234/500 [=============>................] - ETA: 1:30 - loss: 0.1503 - regression_loss: 0.1430 - classification_loss: 0.0073 235/500 [=============>................] - ETA: 1:30 - loss: 0.1502 - regression_loss: 0.1429 - classification_loss: 0.0073 236/500 [=============>................] - ETA: 1:29 - loss: 0.1498 - regression_loss: 0.1425 - classification_loss: 0.0073 237/500 [=============>................] - ETA: 1:29 - loss: 0.1497 - regression_loss: 0.1424 - classification_loss: 0.0073 238/500 [=============>................] - ETA: 1:29 - loss: 0.1494 - regression_loss: 0.1421 - classification_loss: 0.0073 239/500 [=============>................] - ETA: 1:28 - loss: 0.1494 - regression_loss: 0.1421 - classification_loss: 0.0073 240/500 [=============>................] - ETA: 1:28 - loss: 0.1495 - regression_loss: 0.1421 - classification_loss: 0.0073 241/500 [=============>................] - ETA: 1:28 - loss: 0.1494 - regression_loss: 0.1421 - classification_loss: 0.0073 242/500 [=============>................] - ETA: 1:27 - loss: 0.1490 - regression_loss: 0.1417 - classification_loss: 0.0073 243/500 [=============>................] 
- ETA: 1:27 - loss: 0.1492 - regression_loss: 0.1419 - classification_loss: 0.0073 244/500 [=============>................] - ETA: 1:27 - loss: 0.1495 - regression_loss: 0.1422 - classification_loss: 0.0073 245/500 [=============>................] - ETA: 1:26 - loss: 0.1496 - regression_loss: 0.1423 - classification_loss: 0.0074 246/500 [=============>................] - ETA: 1:26 - loss: 0.1495 - regression_loss: 0.1421 - classification_loss: 0.0073 247/500 [=============>................] - ETA: 1:26 - loss: 0.1503 - regression_loss: 0.1430 - classification_loss: 0.0073 248/500 [=============>................] - ETA: 1:25 - loss: 0.1504 - regression_loss: 0.1431 - classification_loss: 0.0073 249/500 [=============>................] - ETA: 1:25 - loss: 0.1505 - regression_loss: 0.1431 - classification_loss: 0.0073 250/500 [==============>...............] - ETA: 1:25 - loss: 0.1503 - regression_loss: 0.1430 - classification_loss: 0.0073 251/500 [==============>...............] - ETA: 1:24 - loss: 0.1509 - regression_loss: 0.1434 - classification_loss: 0.0076 252/500 [==============>...............] - ETA: 1:24 - loss: 0.1510 - regression_loss: 0.1434 - classification_loss: 0.0076 253/500 [==============>...............] - ETA: 1:24 - loss: 0.1512 - regression_loss: 0.1436 - classification_loss: 0.0076 254/500 [==============>...............] - ETA: 1:23 - loss: 0.1526 - regression_loss: 0.1449 - classification_loss: 0.0077 255/500 [==============>...............] - ETA: 1:23 - loss: 0.1528 - regression_loss: 0.1450 - classification_loss: 0.0078 256/500 [==============>...............] - ETA: 1:22 - loss: 0.1533 - regression_loss: 0.1455 - classification_loss: 0.0078 257/500 [==============>...............] - ETA: 1:22 - loss: 0.1531 - regression_loss: 0.1453 - classification_loss: 0.0078 258/500 [==============>...............] - ETA: 1:22 - loss: 0.1529 - regression_loss: 0.1451 - classification_loss: 0.0078 259/500 [==============>...............] 
- ETA: 1:22 - loss: 0.1527 - regression_loss: 0.1449 - classification_loss: 0.0077 260/500 [==============>...............] - ETA: 1:21 - loss: 0.1526 - regression_loss: 0.1449 - classification_loss: 0.0077 261/500 [==============>...............] - ETA: 1:21 - loss: 0.1528 - regression_loss: 0.1451 - classification_loss: 0.0077 262/500 [==============>...............] - ETA: 1:20 - loss: 0.1531 - regression_loss: 0.1454 - classification_loss: 0.0077 263/500 [==============>...............] - ETA: 1:20 - loss: 0.1531 - regression_loss: 0.1453 - classification_loss: 0.0077 264/500 [==============>...............] - ETA: 1:20 - loss: 0.1526 - regression_loss: 0.1449 - classification_loss: 0.0077 265/500 [==============>...............] - ETA: 1:19 - loss: 0.1524 - regression_loss: 0.1448 - classification_loss: 0.0077 266/500 [==============>...............] - ETA: 1:19 - loss: 0.1525 - regression_loss: 0.1448 - classification_loss: 0.0077 267/500 [===============>..............] - ETA: 1:19 - loss: 0.1521 - regression_loss: 0.1444 - classification_loss: 0.0077 268/500 [===============>..............] - ETA: 1:18 - loss: 0.1521 - regression_loss: 0.1444 - classification_loss: 0.0077 269/500 [===============>..............] - ETA: 1:18 - loss: 0.1521 - regression_loss: 0.1444 - classification_loss: 0.0077 270/500 [===============>..............] - ETA: 1:18 - loss: 0.1518 - regression_loss: 0.1441 - classification_loss: 0.0077 271/500 [===============>..............] - ETA: 1:17 - loss: 0.1518 - regression_loss: 0.1441 - classification_loss: 0.0077 272/500 [===============>..............] - ETA: 1:17 - loss: 0.1517 - regression_loss: 0.1440 - classification_loss: 0.0077 273/500 [===============>..............] - ETA: 1:17 - loss: 0.1513 - regression_loss: 0.1437 - classification_loss: 0.0076 274/500 [===============>..............] - ETA: 1:16 - loss: 0.1516 - regression_loss: 0.1439 - classification_loss: 0.0076 275/500 [===============>..............] 
- ETA: 1:16 - loss: 0.1511 - regression_loss: 0.1435 - classification_loss: 0.0076 276/500 [===============>..............] - ETA: 1:16 - loss: 0.1510 - regression_loss: 0.1434 - classification_loss: 0.0076 277/500 [===============>..............] - ETA: 1:15 - loss: 0.1509 - regression_loss: 0.1433 - classification_loss: 0.0076 278/500 [===============>..............] - ETA: 1:15 - loss: 0.1506 - regression_loss: 0.1430 - classification_loss: 0.0076 279/500 [===============>..............] - ETA: 1:15 - loss: 0.1505 - regression_loss: 0.1429 - classification_loss: 0.0076 280/500 [===============>..............] - ETA: 1:14 - loss: 0.1506 - regression_loss: 0.1430 - classification_loss: 0.0076 281/500 [===============>..............] - ETA: 1:14 - loss: 0.1505 - regression_loss: 0.1429 - classification_loss: 0.0076 282/500 [===============>..............] - ETA: 1:14 - loss: 0.1504 - regression_loss: 0.1429 - classification_loss: 0.0076 283/500 [===============>..............] - ETA: 1:13 - loss: 0.1506 - regression_loss: 0.1431 - classification_loss: 0.0076 284/500 [================>.............] - ETA: 1:13 - loss: 0.1504 - regression_loss: 0.1429 - classification_loss: 0.0075 285/500 [================>.............] - ETA: 1:13 - loss: 0.1503 - regression_loss: 0.1428 - classification_loss: 0.0076 286/500 [================>.............] - ETA: 1:12 - loss: 0.1504 - regression_loss: 0.1429 - classification_loss: 0.0076 287/500 [================>.............] - ETA: 1:12 - loss: 0.1503 - regression_loss: 0.1428 - classification_loss: 0.0075 288/500 [================>.............] - ETA: 1:12 - loss: 0.1502 - regression_loss: 0.1427 - classification_loss: 0.0075 289/500 [================>.............] - ETA: 1:11 - loss: 0.1500 - regression_loss: 0.1425 - classification_loss: 0.0075 290/500 [================>.............] - ETA: 1:11 - loss: 0.1498 - regression_loss: 0.1423 - classification_loss: 0.0075 291/500 [================>.............] 
- ETA: 1:11 - loss: 0.1496 - regression_loss: 0.1421 - classification_loss: 0.0075 292/500 [================>.............] - ETA: 1:10 - loss: 0.1496 - regression_loss: 0.1421 - classification_loss: 0.0075 293/500 [================>.............] - ETA: 1:10 - loss: 0.1495 - regression_loss: 0.1420 - classification_loss: 0.0075 294/500 [================>.............] - ETA: 1:10 - loss: 0.1495 - regression_loss: 0.1420 - classification_loss: 0.0075 295/500 [================>.............] - ETA: 1:09 - loss: 0.1493 - regression_loss: 0.1418 - classification_loss: 0.0075 296/500 [================>.............] - ETA: 1:09 - loss: 0.1490 - regression_loss: 0.1415 - classification_loss: 0.0075 297/500 [================>.............] - ETA: 1:09 - loss: 0.1490 - regression_loss: 0.1416 - classification_loss: 0.0075 298/500 [================>.............] - ETA: 1:08 - loss: 0.1488 - regression_loss: 0.1413 - classification_loss: 0.0075 299/500 [================>.............] - ETA: 1:08 - loss: 0.1486 - regression_loss: 0.1411 - classification_loss: 0.0075 300/500 [=================>............] - ETA: 1:08 - loss: 0.1486 - regression_loss: 0.1411 - classification_loss: 0.0075 301/500 [=================>............] - ETA: 1:07 - loss: 0.1485 - regression_loss: 0.1410 - classification_loss: 0.0075 302/500 [=================>............] - ETA: 1:07 - loss: 0.1486 - regression_loss: 0.1411 - classification_loss: 0.0075 303/500 [=================>............] - ETA: 1:07 - loss: 0.1483 - regression_loss: 0.1409 - classification_loss: 0.0075 304/500 [=================>............] - ETA: 1:06 - loss: 0.1488 - regression_loss: 0.1413 - classification_loss: 0.0075 305/500 [=================>............] - ETA: 1:06 - loss: 0.1486 - regression_loss: 0.1412 - classification_loss: 0.0075 306/500 [=================>............] - ETA: 1:06 - loss: 0.1485 - regression_loss: 0.1410 - classification_loss: 0.0075 307/500 [=================>............] 
- ETA: 1:05 - loss: 0.1484 - regression_loss: 0.1409 - classification_loss: 0.0075 308/500 [=================>............] - ETA: 1:05 - loss: 0.1484 - regression_loss: 0.1409 - classification_loss: 0.0075 309/500 [=================>............] - ETA: 1:05 - loss: 0.1481 - regression_loss: 0.1407 - classification_loss: 0.0074 310/500 [=================>............] - ETA: 1:04 - loss: 0.1479 - regression_loss: 0.1405 - classification_loss: 0.0074 311/500 [=================>............] - ETA: 1:04 - loss: 0.1478 - regression_loss: 0.1403 - classification_loss: 0.0074 312/500 [=================>............] - ETA: 1:04 - loss: 0.1476 - regression_loss: 0.1402 - classification_loss: 0.0074 313/500 [=================>............] - ETA: 1:03 - loss: 0.1481 - regression_loss: 0.1407 - classification_loss: 0.0074 314/500 [=================>............] - ETA: 1:03 - loss: 0.1490 - regression_loss: 0.1416 - classification_loss: 0.0074 315/500 [=================>............] - ETA: 1:03 - loss: 0.1488 - regression_loss: 0.1414 - classification_loss: 0.0074 316/500 [=================>............] - ETA: 1:02 - loss: 0.1488 - regression_loss: 0.1413 - classification_loss: 0.0074 317/500 [==================>...........] - ETA: 1:02 - loss: 0.1485 - regression_loss: 0.1411 - classification_loss: 0.0074 318/500 [==================>...........] - ETA: 1:02 - loss: 0.1487 - regression_loss: 0.1413 - classification_loss: 0.0074 319/500 [==================>...........] - ETA: 1:01 - loss: 0.1485 - regression_loss: 0.1411 - classification_loss: 0.0074 320/500 [==================>...........] - ETA: 1:01 - loss: 0.1485 - regression_loss: 0.1411 - classification_loss: 0.0074 321/500 [==================>...........] - ETA: 1:00 - loss: 0.1484 - regression_loss: 0.1410 - classification_loss: 0.0074 322/500 [==================>...........] - ETA: 1:00 - loss: 0.1483 - regression_loss: 0.1409 - classification_loss: 0.0074 323/500 [==================>...........] 
- ETA: 1:00 - loss: 0.1481 - regression_loss: 0.1408 - classification_loss: 0.0074 324/500 [==================>...........] - ETA: 59s - loss: 0.1481 - regression_loss: 0.1407 - classification_loss: 0.0074  325/500 [==================>...........] - ETA: 59s - loss: 0.1478 - regression_loss: 0.1405 - classification_loss: 0.0074 326/500 [==================>...........] - ETA: 59s - loss: 0.1476 - regression_loss: 0.1403 - classification_loss: 0.0074 327/500 [==================>...........] - ETA: 58s - loss: 0.1478 - regression_loss: 0.1403 - classification_loss: 0.0074 328/500 [==================>...........] - ETA: 58s - loss: 0.1474 - regression_loss: 0.1400 - classification_loss: 0.0074 329/500 [==================>...........] - ETA: 58s - loss: 0.1472 - regression_loss: 0.1398 - classification_loss: 0.0074 330/500 [==================>...........] - ETA: 57s - loss: 0.1473 - regression_loss: 0.1400 - classification_loss: 0.0074 331/500 [==================>...........] - ETA: 57s - loss: 0.1472 - regression_loss: 0.1398 - classification_loss: 0.0074 332/500 [==================>...........] - ETA: 57s - loss: 0.1471 - regression_loss: 0.1398 - classification_loss: 0.0073 333/500 [==================>...........] - ETA: 56s - loss: 0.1474 - regression_loss: 0.1400 - classification_loss: 0.0073 334/500 [===================>..........] - ETA: 56s - loss: 0.1473 - regression_loss: 0.1400 - classification_loss: 0.0073 335/500 [===================>..........] - ETA: 56s - loss: 0.1473 - regression_loss: 0.1399 - classification_loss: 0.0073 336/500 [===================>..........] - ETA: 55s - loss: 0.1474 - regression_loss: 0.1400 - classification_loss: 0.0073 337/500 [===================>..........] - ETA: 55s - loss: 0.1478 - regression_loss: 0.1404 - classification_loss: 0.0074 338/500 [===================>..........] - ETA: 55s - loss: 0.1476 - regression_loss: 0.1402 - classification_loss: 0.0074 339/500 [===================>..........] 
- ETA: 54s - loss: 0.1476 - regression_loss: 0.1403 - classification_loss: 0.0074 340/500 [===================>..........] - ETA: 54s - loss: 0.1477 - regression_loss: 0.1403 - classification_loss: 0.0074 341/500 [===================>..........] - ETA: 54s - loss: 0.1480 - regression_loss: 0.1406 - classification_loss: 0.0074 342/500 [===================>..........] - ETA: 53s - loss: 0.1481 - regression_loss: 0.1407 - classification_loss: 0.0074 343/500 [===================>..........] - ETA: 53s - loss: 0.1478 - regression_loss: 0.1405 - classification_loss: 0.0074 344/500 [===================>..........] - ETA: 53s - loss: 0.1477 - regression_loss: 0.1403 - classification_loss: 0.0074 345/500 [===================>..........] - ETA: 52s - loss: 0.1477 - regression_loss: 0.1404 - classification_loss: 0.0074 346/500 [===================>..........] - ETA: 52s - loss: 0.1477 - regression_loss: 0.1403 - classification_loss: 0.0074 347/500 [===================>..........] - ETA: 52s - loss: 0.1478 - regression_loss: 0.1404 - classification_loss: 0.0074 348/500 [===================>..........] - ETA: 51s - loss: 0.1477 - regression_loss: 0.1403 - classification_loss: 0.0074 349/500 [===================>..........] - ETA: 51s - loss: 0.1476 - regression_loss: 0.1403 - classification_loss: 0.0074 350/500 [====================>.........] - ETA: 51s - loss: 0.1476 - regression_loss: 0.1402 - classification_loss: 0.0074 351/500 [====================>.........] - ETA: 50s - loss: 0.1476 - regression_loss: 0.1402 - classification_loss: 0.0073 352/500 [====================>.........] - ETA: 50s - loss: 0.1476 - regression_loss: 0.1403 - classification_loss: 0.0073 353/500 [====================>.........] - ETA: 50s - loss: 0.1475 - regression_loss: 0.1401 - classification_loss: 0.0073 354/500 [====================>.........] - ETA: 49s - loss: 0.1472 - regression_loss: 0.1399 - classification_loss: 0.0073 355/500 [====================>.........] 
- ETA: 49s - loss: 0.1471 - regression_loss: 0.1398 - classification_loss: 0.0073 356/500 [====================>.........] - ETA: 49s - loss: 0.1471 - regression_loss: 0.1398 - classification_loss: 0.0073 357/500 [====================>.........] - ETA: 48s - loss: 0.1468 - regression_loss: 0.1395 - classification_loss: 0.0073 358/500 [====================>.........] - ETA: 48s - loss: 0.1466 - regression_loss: 0.1393 - classification_loss: 0.0073 359/500 [====================>.........] - ETA: 48s - loss: 0.1467 - regression_loss: 0.1394 - classification_loss: 0.0073 360/500 [====================>.........] - ETA: 47s - loss: 0.1464 - regression_loss: 0.1391 - classification_loss: 0.0073 361/500 [====================>.........] - ETA: 47s - loss: 0.1461 - regression_loss: 0.1388 - classification_loss: 0.0073 362/500 [====================>.........] - ETA: 47s - loss: 0.1462 - regression_loss: 0.1390 - classification_loss: 0.0073 363/500 [====================>.........] - ETA: 46s - loss: 0.1468 - regression_loss: 0.1396 - classification_loss: 0.0073 364/500 [====================>.........] - ETA: 46s - loss: 0.1470 - regression_loss: 0.1397 - classification_loss: 0.0073 365/500 [====================>.........] - ETA: 46s - loss: 0.1467 - regression_loss: 0.1394 - classification_loss: 0.0073 366/500 [====================>.........] - ETA: 45s - loss: 0.1464 - regression_loss: 0.1392 - classification_loss: 0.0072 367/500 [=====================>........] - ETA: 45s - loss: 0.1462 - regression_loss: 0.1390 - classification_loss: 0.0072 368/500 [=====================>........] - ETA: 44s - loss: 0.1460 - regression_loss: 0.1388 - classification_loss: 0.0072 369/500 [=====================>........] - ETA: 44s - loss: 0.1459 - regression_loss: 0.1387 - classification_loss: 0.0072 370/500 [=====================>........] - ETA: 44s - loss: 0.1455 - regression_loss: 0.1384 - classification_loss: 0.0072 371/500 [=====================>........] 
- ETA: 43s - loss: 0.1456 - regression_loss: 0.1384 - classification_loss: 0.0072 372/500 [=====================>........] - ETA: 43s - loss: 0.1456 - regression_loss: 0.1384 - classification_loss: 0.0072 373/500 [=====================>........] - ETA: 43s - loss: 0.1454 - regression_loss: 0.1382 - classification_loss: 0.0072 374/500 [=====================>........] - ETA: 42s - loss: 0.1453 - regression_loss: 0.1381 - classification_loss: 0.0072 375/500 [=====================>........] - ETA: 42s - loss: 0.1454 - regression_loss: 0.1382 - classification_loss: 0.0072 376/500 [=====================>........] - ETA: 42s - loss: 0.1452 - regression_loss: 0.1380 - classification_loss: 0.0072 377/500 [=====================>........] - ETA: 41s - loss: 0.1450 - regression_loss: 0.1378 - classification_loss: 0.0072 378/500 [=====================>........] - ETA: 41s - loss: 0.1447 - regression_loss: 0.1376 - classification_loss: 0.0071 379/500 [=====================>........] - ETA: 41s - loss: 0.1445 - regression_loss: 0.1374 - classification_loss: 0.0071 380/500 [=====================>........] - ETA: 40s - loss: 0.1446 - regression_loss: 0.1374 - classification_loss: 0.0072 381/500 [=====================>........] - ETA: 40s - loss: 0.1448 - regression_loss: 0.1376 - classification_loss: 0.0072 382/500 [=====================>........] - ETA: 40s - loss: 0.1445 - regression_loss: 0.1373 - classification_loss: 0.0072 383/500 [=====================>........] - ETA: 39s - loss: 0.1445 - regression_loss: 0.1373 - classification_loss: 0.0072 384/500 [======================>.......] - ETA: 39s - loss: 0.1443 - regression_loss: 0.1371 - classification_loss: 0.0072 385/500 [======================>.......] - ETA: 39s - loss: 0.1441 - regression_loss: 0.1370 - classification_loss: 0.0072 386/500 [======================>.......] - ETA: 38s - loss: 0.1440 - regression_loss: 0.1368 - classification_loss: 0.0072 387/500 [======================>.......] 
- ETA: 38s - loss: 0.1439 - regression_loss: 0.1367 - classification_loss: 0.0072 388/500 [======================>.......] - ETA: 38s - loss: 0.1436 - regression_loss: 0.1365 - classification_loss: 0.0071 389/500 [======================>.......] - ETA: 37s - loss: 0.1435 - regression_loss: 0.1364 - classification_loss: 0.0072 390/500 [======================>.......] - ETA: 37s - loss: 0.1435 - regression_loss: 0.1363 - classification_loss: 0.0072 391/500 [======================>.......] - ETA: 37s - loss: 0.1432 - regression_loss: 0.1361 - classification_loss: 0.0071 392/500 [======================>.......] - ETA: 36s - loss: 0.1431 - regression_loss: 0.1360 - classification_loss: 0.0071 393/500 [======================>.......] - ETA: 36s - loss: 0.1430 - regression_loss: 0.1359 - classification_loss: 0.0071 394/500 [======================>.......] - ETA: 36s - loss: 0.1430 - regression_loss: 0.1359 - classification_loss: 0.0071 395/500 [======================>.......] - ETA: 35s - loss: 0.1428 - regression_loss: 0.1357 - classification_loss: 0.0071 396/500 [======================>.......] - ETA: 35s - loss: 0.1431 - regression_loss: 0.1359 - classification_loss: 0.0071 397/500 [======================>.......] - ETA: 35s - loss: 0.1433 - regression_loss: 0.1361 - classification_loss: 0.0072 398/500 [======================>.......] - ETA: 34s - loss: 0.1431 - regression_loss: 0.1359 - classification_loss: 0.0072 399/500 [======================>.......] - ETA: 34s - loss: 0.1433 - regression_loss: 0.1361 - classification_loss: 0.0072 400/500 [=======================>......] - ETA: 34s - loss: 0.1431 - regression_loss: 0.1359 - classification_loss: 0.0071 401/500 [=======================>......] - ETA: 33s - loss: 0.1429 - regression_loss: 0.1357 - classification_loss: 0.0071 402/500 [=======================>......] - ETA: 33s - loss: 0.1431 - regression_loss: 0.1360 - classification_loss: 0.0071 403/500 [=======================>......] 
- ETA: 33s - loss: 0.1431 - regression_loss: 0.1360 - classification_loss: 0.0071 404/500 [=======================>......] - ETA: 32s - loss: 0.1430 - regression_loss: 0.1359 - classification_loss: 0.0071 405/500 [=======================>......] - ETA: 32s - loss: 0.1431 - regression_loss: 0.1359 - classification_loss: 0.0072 406/500 [=======================>......] - ETA: 32s - loss: 0.1432 - regression_loss: 0.1360 - classification_loss: 0.0072 407/500 [=======================>......] - ETA: 31s - loss: 0.1431 - regression_loss: 0.1359 - classification_loss: 0.0072 408/500 [=======================>......] - ETA: 31s - loss: 0.1429 - regression_loss: 0.1358 - classification_loss: 0.0072 409/500 [=======================>......] - ETA: 31s - loss: 0.1429 - regression_loss: 0.1357 - classification_loss: 0.0071 410/500 [=======================>......] - ETA: 30s - loss: 0.1431 - regression_loss: 0.1359 - classification_loss: 0.0072 411/500 [=======================>......] - ETA: 30s - loss: 0.1429 - regression_loss: 0.1357 - classification_loss: 0.0072 412/500 [=======================>......] - ETA: 30s - loss: 0.1427 - regression_loss: 0.1355 - classification_loss: 0.0072 413/500 [=======================>......] - ETA: 29s - loss: 0.1426 - regression_loss: 0.1355 - classification_loss: 0.0072 414/500 [=======================>......] - ETA: 29s - loss: 0.1425 - regression_loss: 0.1354 - classification_loss: 0.0072 415/500 [=======================>......] - ETA: 29s - loss: 0.1424 - regression_loss: 0.1353 - classification_loss: 0.0071 416/500 [=======================>......] - ETA: 28s - loss: 0.1424 - regression_loss: 0.1352 - classification_loss: 0.0071 417/500 [========================>.....] - ETA: 28s - loss: 0.1423 - regression_loss: 0.1351 - classification_loss: 0.0071 418/500 [========================>.....] - ETA: 27s - loss: 0.1422 - regression_loss: 0.1351 - classification_loss: 0.0071 419/500 [========================>.....] 
- ETA: 27s - loss: 0.1427 - regression_loss: 0.1356 - classification_loss: 0.0071 420/500 [========================>.....] - ETA: 27s - loss: 0.1427 - regression_loss: 0.1356 - classification_loss: 0.0071 421/500 [========================>.....] - ETA: 26s - loss: 0.1427 - regression_loss: 0.1355 - classification_loss: 0.0071 422/500 [========================>.....] - ETA: 26s - loss: 0.1424 - regression_loss: 0.1353 - classification_loss: 0.0071 423/500 [========================>.....] - ETA: 26s - loss: 0.1423 - regression_loss: 0.1352 - classification_loss: 0.0071 424/500 [========================>.....] - ETA: 25s - loss: 0.1422 - regression_loss: 0.1351 - classification_loss: 0.0071 425/500 [========================>.....] - ETA: 25s - loss: 0.1421 - regression_loss: 0.1350 - classification_loss: 0.0071 426/500 [========================>.....] - ETA: 25s - loss: 0.1424 - regression_loss: 0.1353 - classification_loss: 0.0071 427/500 [========================>.....] - ETA: 24s - loss: 0.1422 - regression_loss: 0.1351 - classification_loss: 0.0071 428/500 [========================>.....] - ETA: 24s - loss: 0.1424 - regression_loss: 0.1353 - classification_loss: 0.0071 429/500 [========================>.....] - ETA: 24s - loss: 0.1422 - regression_loss: 0.1351 - classification_loss: 0.0071 430/500 [========================>.....] - ETA: 23s - loss: 0.1422 - regression_loss: 0.1351 - classification_loss: 0.0071 431/500 [========================>.....] - ETA: 23s - loss: 0.1420 - regression_loss: 0.1350 - classification_loss: 0.0071 432/500 [========================>.....] - ETA: 23s - loss: 0.1418 - regression_loss: 0.1348 - classification_loss: 0.0071 433/500 [========================>.....] - ETA: 22s - loss: 0.1418 - regression_loss: 0.1347 - classification_loss: 0.0071 434/500 [=========================>....] - ETA: 22s - loss: 0.1416 - regression_loss: 0.1345 - classification_loss: 0.0071 435/500 [=========================>....] 
[epoch 51: per-batch progress output omitted]
500/500 [==============================] - 171s 341ms/step - loss: 0.1400 - regression_loss: 0.1328 - classification_loss: 0.0072
1172 instances of class plum with average precision: 0.7587
mAP: 0.7587
Epoch 00051: saving model to ./training/snapshots/resnet101_pascal_51.h5
Epoch 52/150
[epoch 52: per-batch progress output omitted]
- ETA: 1:18 - loss: 0.1098 - regression_loss: 0.1035 - classification_loss: 0.0063 271/500 [===============>..............] - ETA: 1:18 - loss: 0.1099 - regression_loss: 0.1036 - classification_loss: 0.0063 272/500 [===============>..............] - ETA: 1:17 - loss: 0.1098 - regression_loss: 0.1035 - classification_loss: 0.0063 273/500 [===============>..............] - ETA: 1:17 - loss: 0.1096 - regression_loss: 0.1033 - classification_loss: 0.0063 274/500 [===============>..............] - ETA: 1:17 - loss: 0.1097 - regression_loss: 0.1034 - classification_loss: 0.0063 275/500 [===============>..............] - ETA: 1:16 - loss: 0.1103 - regression_loss: 0.1038 - classification_loss: 0.0065 276/500 [===============>..............] - ETA: 1:16 - loss: 0.1106 - regression_loss: 0.1041 - classification_loss: 0.0065 277/500 [===============>..............] - ETA: 1:16 - loss: 0.1103 - regression_loss: 0.1038 - classification_loss: 0.0065 278/500 [===============>..............] - ETA: 1:15 - loss: 0.1103 - regression_loss: 0.1038 - classification_loss: 0.0065 279/500 [===============>..............] - ETA: 1:15 - loss: 0.1102 - regression_loss: 0.1037 - classification_loss: 0.0065 280/500 [===============>..............] - ETA: 1:15 - loss: 0.1107 - regression_loss: 0.1042 - classification_loss: 0.0065 281/500 [===============>..............] - ETA: 1:14 - loss: 0.1110 - regression_loss: 0.1044 - classification_loss: 0.0065 282/500 [===============>..............] - ETA: 1:14 - loss: 0.1108 - regression_loss: 0.1043 - classification_loss: 0.0065 283/500 [===============>..............] - ETA: 1:14 - loss: 0.1107 - regression_loss: 0.1042 - classification_loss: 0.0065 284/500 [================>.............] - ETA: 1:13 - loss: 0.1105 - regression_loss: 0.1040 - classification_loss: 0.0065 285/500 [================>.............] - ETA: 1:13 - loss: 0.1107 - regression_loss: 0.1042 - classification_loss: 0.0065 286/500 [================>.............] 
- ETA: 1:13 - loss: 0.1107 - regression_loss: 0.1042 - classification_loss: 0.0065 287/500 [================>.............] - ETA: 1:12 - loss: 0.1107 - regression_loss: 0.1042 - classification_loss: 0.0065 288/500 [================>.............] - ETA: 1:12 - loss: 0.1104 - regression_loss: 0.1039 - classification_loss: 0.0065 289/500 [================>.............] - ETA: 1:12 - loss: 0.1103 - regression_loss: 0.1038 - classification_loss: 0.0065 290/500 [================>.............] - ETA: 1:11 - loss: 0.1101 - regression_loss: 0.1036 - classification_loss: 0.0065 291/500 [================>.............] - ETA: 1:11 - loss: 0.1105 - regression_loss: 0.1041 - classification_loss: 0.0065 292/500 [================>.............] - ETA: 1:11 - loss: 0.1109 - regression_loss: 0.1044 - classification_loss: 0.0065 293/500 [================>.............] - ETA: 1:10 - loss: 0.1107 - regression_loss: 0.1043 - classification_loss: 0.0065 294/500 [================>.............] - ETA: 1:10 - loss: 0.1108 - regression_loss: 0.1043 - classification_loss: 0.0065 295/500 [================>.............] - ETA: 1:10 - loss: 0.1109 - regression_loss: 0.1044 - classification_loss: 0.0065 296/500 [================>.............] - ETA: 1:09 - loss: 0.1108 - regression_loss: 0.1043 - classification_loss: 0.0065 297/500 [================>.............] - ETA: 1:09 - loss: 0.1108 - regression_loss: 0.1043 - classification_loss: 0.0065 298/500 [================>.............] - ETA: 1:09 - loss: 0.1109 - regression_loss: 0.1044 - classification_loss: 0.0065 299/500 [================>.............] - ETA: 1:08 - loss: 0.1106 - regression_loss: 0.1041 - classification_loss: 0.0065 300/500 [=================>............] - ETA: 1:08 - loss: 0.1109 - regression_loss: 0.1044 - classification_loss: 0.0065 301/500 [=================>............] - ETA: 1:08 - loss: 0.1109 - regression_loss: 0.1044 - classification_loss: 0.0065 302/500 [=================>............] 
- ETA: 1:07 - loss: 0.1120 - regression_loss: 0.1055 - classification_loss: 0.0064 303/500 [=================>............] - ETA: 1:07 - loss: 0.1119 - regression_loss: 0.1054 - classification_loss: 0.0064 304/500 [=================>............] - ETA: 1:07 - loss: 0.1117 - regression_loss: 0.1052 - classification_loss: 0.0064 305/500 [=================>............] - ETA: 1:06 - loss: 0.1120 - regression_loss: 0.1056 - classification_loss: 0.0064 306/500 [=================>............] - ETA: 1:06 - loss: 0.1122 - regression_loss: 0.1058 - classification_loss: 0.0064 307/500 [=================>............] - ETA: 1:05 - loss: 0.1122 - regression_loss: 0.1058 - classification_loss: 0.0064 308/500 [=================>............] - ETA: 1:05 - loss: 0.1122 - regression_loss: 0.1057 - classification_loss: 0.0065 309/500 [=================>............] - ETA: 1:05 - loss: 0.1120 - regression_loss: 0.1056 - classification_loss: 0.0065 310/500 [=================>............] - ETA: 1:04 - loss: 0.1121 - regression_loss: 0.1056 - classification_loss: 0.0064 311/500 [=================>............] - ETA: 1:04 - loss: 0.1122 - regression_loss: 0.1057 - classification_loss: 0.0065 312/500 [=================>............] - ETA: 1:04 - loss: 0.1123 - regression_loss: 0.1059 - classification_loss: 0.0065 313/500 [=================>............] - ETA: 1:03 - loss: 0.1123 - regression_loss: 0.1058 - classification_loss: 0.0065 314/500 [=================>............] - ETA: 1:03 - loss: 0.1120 - regression_loss: 0.1056 - classification_loss: 0.0064 315/500 [=================>............] - ETA: 1:03 - loss: 0.1118 - regression_loss: 0.1054 - classification_loss: 0.0064 316/500 [=================>............] - ETA: 1:02 - loss: 0.1119 - regression_loss: 0.1055 - classification_loss: 0.0064 317/500 [==================>...........] - ETA: 1:02 - loss: 0.1120 - regression_loss: 0.1056 - classification_loss: 0.0064 318/500 [==================>...........] 
- ETA: 1:02 - loss: 0.1120 - regression_loss: 0.1055 - classification_loss: 0.0064 319/500 [==================>...........] - ETA: 1:01 - loss: 0.1119 - regression_loss: 0.1055 - classification_loss: 0.0064 320/500 [==================>...........] - ETA: 1:01 - loss: 0.1121 - regression_loss: 0.1056 - classification_loss: 0.0064 321/500 [==================>...........] - ETA: 1:01 - loss: 0.1120 - regression_loss: 0.1056 - classification_loss: 0.0064 322/500 [==================>...........] - ETA: 1:00 - loss: 0.1118 - regression_loss: 0.1054 - classification_loss: 0.0064 323/500 [==================>...........] - ETA: 1:00 - loss: 0.1120 - regression_loss: 0.1056 - classification_loss: 0.0064 324/500 [==================>...........] - ETA: 1:00 - loss: 0.1121 - regression_loss: 0.1057 - classification_loss: 0.0064 325/500 [==================>...........] - ETA: 59s - loss: 0.1120 - regression_loss: 0.1056 - classification_loss: 0.0064  326/500 [==================>...........] - ETA: 59s - loss: 0.1120 - regression_loss: 0.1056 - classification_loss: 0.0064 327/500 [==================>...........] - ETA: 59s - loss: 0.1118 - regression_loss: 0.1054 - classification_loss: 0.0064 328/500 [==================>...........] - ETA: 58s - loss: 0.1121 - regression_loss: 0.1057 - classification_loss: 0.0064 329/500 [==================>...........] - ETA: 58s - loss: 0.1120 - regression_loss: 0.1056 - classification_loss: 0.0064 330/500 [==================>...........] - ETA: 58s - loss: 0.1119 - regression_loss: 0.1055 - classification_loss: 0.0064 331/500 [==================>...........] - ETA: 57s - loss: 0.1122 - regression_loss: 0.1058 - classification_loss: 0.0064 332/500 [==================>...........] - ETA: 57s - loss: 0.1120 - regression_loss: 0.1056 - classification_loss: 0.0064 333/500 [==================>...........] - ETA: 57s - loss: 0.1121 - regression_loss: 0.1057 - classification_loss: 0.0064 334/500 [===================>..........] 
- ETA: 56s - loss: 0.1119 - regression_loss: 0.1055 - classification_loss: 0.0064 335/500 [===================>..........] - ETA: 56s - loss: 0.1120 - regression_loss: 0.1056 - classification_loss: 0.0064 336/500 [===================>..........] - ETA: 56s - loss: 0.1122 - regression_loss: 0.1058 - classification_loss: 0.0064 337/500 [===================>..........] - ETA: 55s - loss: 0.1122 - regression_loss: 0.1058 - classification_loss: 0.0064 338/500 [===================>..........] - ETA: 55s - loss: 0.1119 - regression_loss: 0.1056 - classification_loss: 0.0064 339/500 [===================>..........] - ETA: 55s - loss: 0.1124 - regression_loss: 0.1060 - classification_loss: 0.0064 340/500 [===================>..........] - ETA: 54s - loss: 0.1123 - regression_loss: 0.1059 - classification_loss: 0.0064 341/500 [===================>..........] - ETA: 54s - loss: 0.1122 - regression_loss: 0.1058 - classification_loss: 0.0064 342/500 [===================>..........] - ETA: 54s - loss: 0.1119 - regression_loss: 0.1055 - classification_loss: 0.0064 343/500 [===================>..........] - ETA: 53s - loss: 0.1118 - regression_loss: 0.1054 - classification_loss: 0.0064 344/500 [===================>..........] - ETA: 53s - loss: 0.1116 - regression_loss: 0.1052 - classification_loss: 0.0064 345/500 [===================>..........] - ETA: 53s - loss: 0.1117 - regression_loss: 0.1053 - classification_loss: 0.0064 346/500 [===================>..........] - ETA: 52s - loss: 0.1115 - regression_loss: 0.1052 - classification_loss: 0.0064 347/500 [===================>..........] - ETA: 52s - loss: 0.1115 - regression_loss: 0.1052 - classification_loss: 0.0064 348/500 [===================>..........] - ETA: 52s - loss: 0.1117 - regression_loss: 0.1053 - classification_loss: 0.0064 349/500 [===================>..........] - ETA: 51s - loss: 0.1116 - regression_loss: 0.1052 - classification_loss: 0.0064 350/500 [====================>.........] 
- ETA: 51s - loss: 0.1118 - regression_loss: 0.1053 - classification_loss: 0.0064 351/500 [====================>.........] - ETA: 50s - loss: 0.1115 - regression_loss: 0.1051 - classification_loss: 0.0064 352/500 [====================>.........] - ETA: 50s - loss: 0.1114 - regression_loss: 0.1050 - classification_loss: 0.0064 353/500 [====================>.........] - ETA: 50s - loss: 0.1115 - regression_loss: 0.1050 - classification_loss: 0.0064 354/500 [====================>.........] - ETA: 49s - loss: 0.1114 - regression_loss: 0.1050 - classification_loss: 0.0064 355/500 [====================>.........] - ETA: 49s - loss: 0.1117 - regression_loss: 0.1053 - classification_loss: 0.0064 356/500 [====================>.........] - ETA: 49s - loss: 0.1117 - regression_loss: 0.1053 - classification_loss: 0.0064 357/500 [====================>.........] - ETA: 48s - loss: 0.1117 - regression_loss: 0.1052 - classification_loss: 0.0064 358/500 [====================>.........] - ETA: 48s - loss: 0.1114 - regression_loss: 0.1050 - classification_loss: 0.0064 359/500 [====================>.........] - ETA: 48s - loss: 0.1114 - regression_loss: 0.1050 - classification_loss: 0.0064 360/500 [====================>.........] - ETA: 47s - loss: 0.1113 - regression_loss: 0.1049 - classification_loss: 0.0064 361/500 [====================>.........] - ETA: 47s - loss: 0.1111 - regression_loss: 0.1048 - classification_loss: 0.0064 362/500 [====================>.........] - ETA: 47s - loss: 0.1113 - regression_loss: 0.1050 - classification_loss: 0.0064 363/500 [====================>.........] - ETA: 46s - loss: 0.1112 - regression_loss: 0.1049 - classification_loss: 0.0064 364/500 [====================>.........] - ETA: 46s - loss: 0.1110 - regression_loss: 0.1047 - classification_loss: 0.0064 365/500 [====================>.........] - ETA: 46s - loss: 0.1109 - regression_loss: 0.1045 - classification_loss: 0.0063 366/500 [====================>.........] 
- ETA: 45s - loss: 0.1107 - regression_loss: 0.1043 - classification_loss: 0.0063 367/500 [=====================>........] - ETA: 45s - loss: 0.1105 - regression_loss: 0.1042 - classification_loss: 0.0063 368/500 [=====================>........] - ETA: 45s - loss: 0.1105 - regression_loss: 0.1042 - classification_loss: 0.0063 369/500 [=====================>........] - ETA: 44s - loss: 0.1105 - regression_loss: 0.1042 - classification_loss: 0.0063 370/500 [=====================>........] - ETA: 44s - loss: 0.1104 - regression_loss: 0.1041 - classification_loss: 0.0063 371/500 [=====================>........] - ETA: 44s - loss: 0.1104 - regression_loss: 0.1041 - classification_loss: 0.0063 372/500 [=====================>........] - ETA: 43s - loss: 0.1105 - regression_loss: 0.1041 - classification_loss: 0.0063 373/500 [=====================>........] - ETA: 43s - loss: 0.1104 - regression_loss: 0.1041 - classification_loss: 0.0063 374/500 [=====================>........] - ETA: 43s - loss: 0.1105 - regression_loss: 0.1042 - classification_loss: 0.0063 375/500 [=====================>........] - ETA: 42s - loss: 0.1106 - regression_loss: 0.1043 - classification_loss: 0.0063 376/500 [=====================>........] - ETA: 42s - loss: 0.1105 - regression_loss: 0.1042 - classification_loss: 0.0063 377/500 [=====================>........] - ETA: 42s - loss: 0.1105 - regression_loss: 0.1042 - classification_loss: 0.0063 378/500 [=====================>........] - ETA: 41s - loss: 0.1104 - regression_loss: 0.1041 - classification_loss: 0.0063 379/500 [=====================>........] - ETA: 41s - loss: 0.1103 - regression_loss: 0.1040 - classification_loss: 0.0063 380/500 [=====================>........] - ETA: 41s - loss: 0.1101 - regression_loss: 0.1038 - classification_loss: 0.0063 381/500 [=====================>........] - ETA: 40s - loss: 0.1103 - regression_loss: 0.1040 - classification_loss: 0.0063 382/500 [=====================>........] 
- ETA: 40s - loss: 0.1102 - regression_loss: 0.1039 - classification_loss: 0.0063 383/500 [=====================>........] - ETA: 40s - loss: 0.1103 - regression_loss: 0.1040 - classification_loss: 0.0063 384/500 [======================>.......] - ETA: 39s - loss: 0.1106 - regression_loss: 0.1043 - classification_loss: 0.0063 385/500 [======================>.......] - ETA: 39s - loss: 0.1106 - regression_loss: 0.1043 - classification_loss: 0.0063 386/500 [======================>.......] - ETA: 39s - loss: 0.1105 - regression_loss: 0.1042 - classification_loss: 0.0063 387/500 [======================>.......] - ETA: 38s - loss: 0.1103 - regression_loss: 0.1040 - classification_loss: 0.0063 388/500 [======================>.......] - ETA: 38s - loss: 0.1105 - regression_loss: 0.1042 - classification_loss: 0.0063 389/500 [======================>.......] - ETA: 37s - loss: 0.1105 - regression_loss: 0.1042 - classification_loss: 0.0063 390/500 [======================>.......] - ETA: 37s - loss: 0.1104 - regression_loss: 0.1041 - classification_loss: 0.0063 391/500 [======================>.......] - ETA: 37s - loss: 0.1103 - regression_loss: 0.1040 - classification_loss: 0.0063 392/500 [======================>.......] - ETA: 36s - loss: 0.1104 - regression_loss: 0.1041 - classification_loss: 0.0063 393/500 [======================>.......] - ETA: 36s - loss: 0.1102 - regression_loss: 0.1039 - classification_loss: 0.0063 394/500 [======================>.......] - ETA: 36s - loss: 0.1100 - regression_loss: 0.1037 - classification_loss: 0.0063 395/500 [======================>.......] - ETA: 35s - loss: 0.1099 - regression_loss: 0.1036 - classification_loss: 0.0063 396/500 [======================>.......] - ETA: 35s - loss: 0.1098 - regression_loss: 0.1036 - classification_loss: 0.0063 397/500 [======================>.......] - ETA: 35s - loss: 0.1100 - regression_loss: 0.1037 - classification_loss: 0.0063 398/500 [======================>.......] 
- ETA: 34s - loss: 0.1101 - regression_loss: 0.1038 - classification_loss: 0.0063 399/500 [======================>.......] - ETA: 34s - loss: 0.1099 - regression_loss: 0.1036 - classification_loss: 0.0063 400/500 [=======================>......] - ETA: 34s - loss: 0.1097 - regression_loss: 0.1034 - classification_loss: 0.0062 401/500 [=======================>......] - ETA: 33s - loss: 0.1095 - regression_loss: 0.1032 - classification_loss: 0.0062 402/500 [=======================>......] - ETA: 33s - loss: 0.1094 - regression_loss: 0.1032 - classification_loss: 0.0062 403/500 [=======================>......] - ETA: 33s - loss: 0.1095 - regression_loss: 0.1032 - classification_loss: 0.0062 404/500 [=======================>......] - ETA: 32s - loss: 0.1094 - regression_loss: 0.1032 - classification_loss: 0.0062 405/500 [=======================>......] - ETA: 32s - loss: 0.1093 - regression_loss: 0.1031 - classification_loss: 0.0062 406/500 [=======================>......] - ETA: 32s - loss: 0.1093 - regression_loss: 0.1030 - classification_loss: 0.0062 407/500 [=======================>......] - ETA: 31s - loss: 0.1092 - regression_loss: 0.1030 - classification_loss: 0.0062 408/500 [=======================>......] - ETA: 31s - loss: 0.1094 - regression_loss: 0.1032 - classification_loss: 0.0062 409/500 [=======================>......] - ETA: 31s - loss: 0.1093 - regression_loss: 0.1031 - classification_loss: 0.0062 410/500 [=======================>......] - ETA: 30s - loss: 0.1091 - regression_loss: 0.1029 - classification_loss: 0.0062 411/500 [=======================>......] - ETA: 30s - loss: 0.1091 - regression_loss: 0.1029 - classification_loss: 0.0062 412/500 [=======================>......] - ETA: 30s - loss: 0.1090 - regression_loss: 0.1028 - classification_loss: 0.0062 413/500 [=======================>......] - ETA: 29s - loss: 0.1091 - regression_loss: 0.1029 - classification_loss: 0.0062 414/500 [=======================>......] 
- ETA: 29s - loss: 0.1090 - regression_loss: 0.1028 - classification_loss: 0.0062 415/500 [=======================>......] - ETA: 29s - loss: 0.1089 - regression_loss: 0.1026 - classification_loss: 0.0062 416/500 [=======================>......] - ETA: 28s - loss: 0.1086 - regression_loss: 0.1024 - classification_loss: 0.0062 417/500 [========================>.....] - ETA: 28s - loss: 0.1090 - regression_loss: 0.1028 - classification_loss: 0.0063 418/500 [========================>.....] - ETA: 28s - loss: 0.1090 - regression_loss: 0.1027 - classification_loss: 0.0063 419/500 [========================>.....] - ETA: 27s - loss: 0.1089 - regression_loss: 0.1027 - classification_loss: 0.0063 420/500 [========================>.....] - ETA: 27s - loss: 0.1089 - regression_loss: 0.1026 - classification_loss: 0.0063 421/500 [========================>.....] - ETA: 27s - loss: 0.1087 - regression_loss: 0.1025 - classification_loss: 0.0063 422/500 [========================>.....] - ETA: 26s - loss: 0.1086 - regression_loss: 0.1024 - classification_loss: 0.0063 423/500 [========================>.....] - ETA: 26s - loss: 0.1086 - regression_loss: 0.1023 - classification_loss: 0.0063 424/500 [========================>.....] - ETA: 25s - loss: 0.1091 - regression_loss: 0.1028 - classification_loss: 0.0063 425/500 [========================>.....] - ETA: 25s - loss: 0.1091 - regression_loss: 0.1028 - classification_loss: 0.0063 426/500 [========================>.....] - ETA: 25s - loss: 0.1089 - regression_loss: 0.1027 - classification_loss: 0.0063 427/500 [========================>.....] - ETA: 24s - loss: 0.1089 - regression_loss: 0.1026 - classification_loss: 0.0063 428/500 [========================>.....] - ETA: 24s - loss: 0.1092 - regression_loss: 0.1029 - classification_loss: 0.0063 429/500 [========================>.....] - ETA: 24s - loss: 0.1093 - regression_loss: 0.1030 - classification_loss: 0.0063 430/500 [========================>.....] 
- ETA: 23s - loss: 0.1095 - regression_loss: 0.1032 - classification_loss: 0.0063 431/500 [========================>.....] - ETA: 23s - loss: 0.1095 - regression_loss: 0.1032 - classification_loss: 0.0063 432/500 [========================>.....] - ETA: 23s - loss: 0.1095 - regression_loss: 0.1032 - classification_loss: 0.0063 433/500 [========================>.....] - ETA: 22s - loss: 0.1094 - regression_loss: 0.1031 - classification_loss: 0.0063 434/500 [=========================>....] - ETA: 22s - loss: 0.1094 - regression_loss: 0.1031 - classification_loss: 0.0063 435/500 [=========================>....] - ETA: 22s - loss: 0.1094 - regression_loss: 0.1030 - classification_loss: 0.0063 436/500 [=========================>....] - ETA: 21s - loss: 0.1095 - regression_loss: 0.1032 - classification_loss: 0.0063 437/500 [=========================>....] - ETA: 21s - loss: 0.1094 - regression_loss: 0.1030 - classification_loss: 0.0063 438/500 [=========================>....] - ETA: 21s - loss: 0.1094 - regression_loss: 0.1031 - classification_loss: 0.0063 439/500 [=========================>....] - ETA: 20s - loss: 0.1097 - regression_loss: 0.1034 - classification_loss: 0.0064 440/500 [=========================>....] - ETA: 20s - loss: 0.1099 - regression_loss: 0.1035 - classification_loss: 0.0064 441/500 [=========================>....] - ETA: 20s - loss: 0.1098 - regression_loss: 0.1035 - classification_loss: 0.0064 442/500 [=========================>....] - ETA: 19s - loss: 0.1099 - regression_loss: 0.1036 - classification_loss: 0.0064 443/500 [=========================>....] - ETA: 19s - loss: 0.1098 - regression_loss: 0.1034 - classification_loss: 0.0063 444/500 [=========================>....] - ETA: 19s - loss: 0.1097 - regression_loss: 0.1034 - classification_loss: 0.0063 445/500 [=========================>....] - ETA: 18s - loss: 0.1099 - regression_loss: 0.1036 - classification_loss: 0.0063 446/500 [=========================>....] 
- ETA: 18s - loss: 0.1098 - regression_loss: 0.1034 - classification_loss: 0.0063 447/500 [=========================>....] - ETA: 18s - loss: 0.1097 - regression_loss: 0.1034 - classification_loss: 0.0063 448/500 [=========================>....] - ETA: 17s - loss: 0.1096 - regression_loss: 0.1033 - classification_loss: 0.0063 449/500 [=========================>....] - ETA: 17s - loss: 0.1097 - regression_loss: 0.1034 - classification_loss: 0.0063 450/500 [==========================>...] - ETA: 17s - loss: 0.1095 - regression_loss: 0.1032 - classification_loss: 0.0063 451/500 [==========================>...] - ETA: 16s - loss: 0.1094 - regression_loss: 0.1031 - classification_loss: 0.0063 452/500 [==========================>...] - ETA: 16s - loss: 0.1094 - regression_loss: 0.1030 - classification_loss: 0.0063 453/500 [==========================>...] - ETA: 16s - loss: 0.1092 - regression_loss: 0.1029 - classification_loss: 0.0063 454/500 [==========================>...] - ETA: 15s - loss: 0.1091 - regression_loss: 0.1028 - classification_loss: 0.0063 455/500 [==========================>...] - ETA: 15s - loss: 0.1090 - regression_loss: 0.1027 - classification_loss: 0.0063 456/500 [==========================>...] - ETA: 15s - loss: 0.1090 - regression_loss: 0.1027 - classification_loss: 0.0063 457/500 [==========================>...] - ETA: 14s - loss: 0.1089 - regression_loss: 0.1026 - classification_loss: 0.0063 458/500 [==========================>...] - ETA: 14s - loss: 0.1087 - regression_loss: 0.1024 - classification_loss: 0.0063 459/500 [==========================>...] - ETA: 14s - loss: 0.1085 - regression_loss: 0.1022 - classification_loss: 0.0063 460/500 [==========================>...] - ETA: 13s - loss: 0.1091 - regression_loss: 0.1028 - classification_loss: 0.0063 461/500 [==========================>...] - ETA: 13s - loss: 0.1090 - regression_loss: 0.1028 - classification_loss: 0.0063 462/500 [==========================>...] 
- ETA: 12s - loss: 0.1090 - regression_loss: 0.1027 - classification_loss: 0.0063 463/500 [==========================>...] - ETA: 12s - loss: 0.1088 - regression_loss: 0.1026 - classification_loss: 0.0063 464/500 [==========================>...] - ETA: 12s - loss: 0.1089 - regression_loss: 0.1026 - classification_loss: 0.0063 465/500 [==========================>...] - ETA: 11s - loss: 0.1089 - regression_loss: 0.1027 - classification_loss: 0.0063 466/500 [==========================>...] - ETA: 11s - loss: 0.1089 - regression_loss: 0.1026 - classification_loss: 0.0063 467/500 [===========================>..] - ETA: 11s - loss: 0.1087 - regression_loss: 0.1024 - classification_loss: 0.0063 468/500 [===========================>..] - ETA: 10s - loss: 0.1087 - regression_loss: 0.1024 - classification_loss: 0.0063 469/500 [===========================>..] - ETA: 10s - loss: 0.1089 - regression_loss: 0.1027 - classification_loss: 0.0063 470/500 [===========================>..] - ETA: 10s - loss: 0.1089 - regression_loss: 0.1026 - classification_loss: 0.0063 471/500 [===========================>..] - ETA: 9s - loss: 0.1087 - regression_loss: 0.1024 - classification_loss: 0.0063  472/500 [===========================>..] - ETA: 9s - loss: 0.1086 - regression_loss: 0.1023 - classification_loss: 0.0063 473/500 [===========================>..] - ETA: 9s - loss: 0.1085 - regression_loss: 0.1022 - classification_loss: 0.0063 474/500 [===========================>..] - ETA: 8s - loss: 0.1085 - regression_loss: 0.1022 - classification_loss: 0.0063 475/500 [===========================>..] - ETA: 8s - loss: 0.1084 - regression_loss: 0.1022 - classification_loss: 0.0063 476/500 [===========================>..] - ETA: 8s - loss: 0.1086 - regression_loss: 0.1023 - classification_loss: 0.0063 477/500 [===========================>..] - ETA: 7s - loss: 0.1086 - regression_loss: 0.1023 - classification_loss: 0.0063 478/500 [===========================>..] 
- ETA: 7s - loss: 0.1086 - regression_loss: 0.1023 - classification_loss: 0.0063
[per-step progress updates for steps 479-499 of epoch 52 omitted; losses held steady near the final values below]
500/500 [==============================] - 171s 342ms/step - loss: 0.1085 - regression_loss: 0.1021 - classification_loss: 0.0063
1172 instances of class plum with average precision: 0.7641
mAP: 0.7641
Epoch 00052: saving model to ./training/snapshots/resnet101_pascal_52.h5
Epoch 53/150
[per-step progress updates for steps 1-312 of epoch 53 omitted; loss fluctuated between roughly 0.03 and 0.11 before settling near 0.096, with classification_loss around 0.0057-0.0060]
313/500 [=================>............]
- ETA: 1:03 - loss: 0.0960 - regression_loss: 0.0903 - classification_loss: 0.0057 314/500 [=================>............] - ETA: 1:03 - loss: 0.0959 - regression_loss: 0.0902 - classification_loss: 0.0057 315/500 [=================>............] - ETA: 1:03 - loss: 0.0957 - regression_loss: 0.0900 - classification_loss: 0.0057 316/500 [=================>............] - ETA: 1:02 - loss: 0.0956 - regression_loss: 0.0899 - classification_loss: 0.0057 317/500 [==================>...........] - ETA: 1:02 - loss: 0.0955 - regression_loss: 0.0898 - classification_loss: 0.0057 318/500 [==================>...........] - ETA: 1:02 - loss: 0.0955 - regression_loss: 0.0898 - classification_loss: 0.0057 319/500 [==================>...........] - ETA: 1:01 - loss: 0.0955 - regression_loss: 0.0898 - classification_loss: 0.0057 320/500 [==================>...........] - ETA: 1:01 - loss: 0.0957 - regression_loss: 0.0900 - classification_loss: 0.0057 321/500 [==================>...........] - ETA: 1:00 - loss: 0.0954 - regression_loss: 0.0897 - classification_loss: 0.0057 322/500 [==================>...........] - ETA: 1:00 - loss: 0.0956 - regression_loss: 0.0899 - classification_loss: 0.0057 323/500 [==================>...........] - ETA: 1:00 - loss: 0.0954 - regression_loss: 0.0898 - classification_loss: 0.0057 324/500 [==================>...........] - ETA: 59s - loss: 0.0956 - regression_loss: 0.0899 - classification_loss: 0.0057  325/500 [==================>...........] - ETA: 59s - loss: 0.0956 - regression_loss: 0.0898 - classification_loss: 0.0057 326/500 [==================>...........] - ETA: 59s - loss: 0.0955 - regression_loss: 0.0898 - classification_loss: 0.0057 327/500 [==================>...........] - ETA: 58s - loss: 0.0954 - regression_loss: 0.0897 - classification_loss: 0.0057 328/500 [==================>...........] - ETA: 58s - loss: 0.0953 - regression_loss: 0.0896 - classification_loss: 0.0057 329/500 [==================>...........] 
- ETA: 58s - loss: 0.0950 - regression_loss: 0.0894 - classification_loss: 0.0057 330/500 [==================>...........] - ETA: 57s - loss: 0.0949 - regression_loss: 0.0892 - classification_loss: 0.0057 331/500 [==================>...........] - ETA: 57s - loss: 0.0953 - regression_loss: 0.0896 - classification_loss: 0.0057 332/500 [==================>...........] - ETA: 57s - loss: 0.0953 - regression_loss: 0.0896 - classification_loss: 0.0057 333/500 [==================>...........] - ETA: 56s - loss: 0.0951 - regression_loss: 0.0894 - classification_loss: 0.0057 334/500 [===================>..........] - ETA: 56s - loss: 0.0951 - regression_loss: 0.0894 - classification_loss: 0.0057 335/500 [===================>..........] - ETA: 56s - loss: 0.0949 - regression_loss: 0.0892 - classification_loss: 0.0057 336/500 [===================>..........] - ETA: 55s - loss: 0.0949 - regression_loss: 0.0892 - classification_loss: 0.0057 337/500 [===================>..........] - ETA: 55s - loss: 0.0947 - regression_loss: 0.0891 - classification_loss: 0.0057 338/500 [===================>..........] - ETA: 55s - loss: 0.0948 - regression_loss: 0.0891 - classification_loss: 0.0057 339/500 [===================>..........] - ETA: 54s - loss: 0.0949 - regression_loss: 0.0892 - classification_loss: 0.0057 340/500 [===================>..........] - ETA: 54s - loss: 0.0952 - regression_loss: 0.0895 - classification_loss: 0.0057 341/500 [===================>..........] - ETA: 54s - loss: 0.0955 - regression_loss: 0.0897 - classification_loss: 0.0058 342/500 [===================>..........] - ETA: 53s - loss: 0.0955 - regression_loss: 0.0898 - classification_loss: 0.0058 343/500 [===================>..........] - ETA: 53s - loss: 0.0961 - regression_loss: 0.0902 - classification_loss: 0.0059 344/500 [===================>..........] - ETA: 53s - loss: 0.0960 - regression_loss: 0.0900 - classification_loss: 0.0059 345/500 [===================>..........] 
- ETA: 52s - loss: 0.0958 - regression_loss: 0.0899 - classification_loss: 0.0059 346/500 [===================>..........] - ETA: 52s - loss: 0.0960 - regression_loss: 0.0901 - classification_loss: 0.0059 347/500 [===================>..........] - ETA: 52s - loss: 0.0960 - regression_loss: 0.0901 - classification_loss: 0.0059 348/500 [===================>..........] - ETA: 51s - loss: 0.0959 - regression_loss: 0.0900 - classification_loss: 0.0059 349/500 [===================>..........] - ETA: 51s - loss: 0.0957 - regression_loss: 0.0898 - classification_loss: 0.0059 350/500 [====================>.........] - ETA: 51s - loss: 0.0956 - regression_loss: 0.0897 - classification_loss: 0.0059 351/500 [====================>.........] - ETA: 50s - loss: 0.0956 - regression_loss: 0.0897 - classification_loss: 0.0059 352/500 [====================>.........] - ETA: 50s - loss: 0.0959 - regression_loss: 0.0900 - classification_loss: 0.0059 353/500 [====================>.........] - ETA: 50s - loss: 0.0958 - regression_loss: 0.0899 - classification_loss: 0.0059 354/500 [====================>.........] - ETA: 49s - loss: 0.0958 - regression_loss: 0.0899 - classification_loss: 0.0059 355/500 [====================>.........] - ETA: 49s - loss: 0.0959 - regression_loss: 0.0900 - classification_loss: 0.0059 356/500 [====================>.........] - ETA: 49s - loss: 0.0961 - regression_loss: 0.0902 - classification_loss: 0.0059 357/500 [====================>.........] - ETA: 48s - loss: 0.0960 - regression_loss: 0.0901 - classification_loss: 0.0059 358/500 [====================>.........] - ETA: 48s - loss: 0.0961 - regression_loss: 0.0903 - classification_loss: 0.0059 359/500 [====================>.........] - ETA: 47s - loss: 0.0960 - regression_loss: 0.0901 - classification_loss: 0.0059 360/500 [====================>.........] - ETA: 47s - loss: 0.0962 - regression_loss: 0.0903 - classification_loss: 0.0059 361/500 [====================>.........] 
- ETA: 47s - loss: 0.0961 - regression_loss: 0.0902 - classification_loss: 0.0059 362/500 [====================>.........] - ETA: 46s - loss: 0.0960 - regression_loss: 0.0901 - classification_loss: 0.0059 363/500 [====================>.........] - ETA: 46s - loss: 0.0959 - regression_loss: 0.0901 - classification_loss: 0.0059 364/500 [====================>.........] - ETA: 46s - loss: 0.0959 - regression_loss: 0.0900 - classification_loss: 0.0059 365/500 [====================>.........] - ETA: 45s - loss: 0.0963 - regression_loss: 0.0904 - classification_loss: 0.0059 366/500 [====================>.........] - ETA: 45s - loss: 0.0961 - regression_loss: 0.0902 - classification_loss: 0.0059 367/500 [=====================>........] - ETA: 45s - loss: 0.0959 - regression_loss: 0.0900 - classification_loss: 0.0058 368/500 [=====================>........] - ETA: 44s - loss: 0.0957 - regression_loss: 0.0899 - classification_loss: 0.0058 369/500 [=====================>........] - ETA: 44s - loss: 0.0958 - regression_loss: 0.0899 - classification_loss: 0.0058 370/500 [=====================>........] - ETA: 44s - loss: 0.0957 - regression_loss: 0.0899 - classification_loss: 0.0058 371/500 [=====================>........] - ETA: 43s - loss: 0.0958 - regression_loss: 0.0900 - classification_loss: 0.0059 372/500 [=====================>........] - ETA: 43s - loss: 0.0959 - regression_loss: 0.0900 - classification_loss: 0.0058 373/500 [=====================>........] - ETA: 43s - loss: 0.0958 - regression_loss: 0.0899 - classification_loss: 0.0058 374/500 [=====================>........] - ETA: 42s - loss: 0.0957 - regression_loss: 0.0899 - classification_loss: 0.0058 375/500 [=====================>........] - ETA: 42s - loss: 0.0960 - regression_loss: 0.0901 - classification_loss: 0.0059 376/500 [=====================>........] - ETA: 42s - loss: 0.0959 - regression_loss: 0.0901 - classification_loss: 0.0059 377/500 [=====================>........] 
- ETA: 41s - loss: 0.0958 - regression_loss: 0.0900 - classification_loss: 0.0058 378/500 [=====================>........] - ETA: 41s - loss: 0.0960 - regression_loss: 0.0901 - classification_loss: 0.0058 379/500 [=====================>........] - ETA: 41s - loss: 0.0962 - regression_loss: 0.0903 - classification_loss: 0.0059 380/500 [=====================>........] - ETA: 40s - loss: 0.0962 - regression_loss: 0.0903 - classification_loss: 0.0059 381/500 [=====================>........] - ETA: 40s - loss: 0.0960 - regression_loss: 0.0901 - classification_loss: 0.0059 382/500 [=====================>........] - ETA: 40s - loss: 0.0958 - regression_loss: 0.0900 - classification_loss: 0.0059 383/500 [=====================>........] - ETA: 39s - loss: 0.0957 - regression_loss: 0.0899 - classification_loss: 0.0059 384/500 [======================>.......] - ETA: 39s - loss: 0.0965 - regression_loss: 0.0906 - classification_loss: 0.0059 385/500 [======================>.......] - ETA: 39s - loss: 0.0965 - regression_loss: 0.0906 - classification_loss: 0.0059 386/500 [======================>.......] - ETA: 38s - loss: 0.0966 - regression_loss: 0.0907 - classification_loss: 0.0059 387/500 [======================>.......] - ETA: 38s - loss: 0.0964 - regression_loss: 0.0905 - classification_loss: 0.0059 388/500 [======================>.......] - ETA: 38s - loss: 0.0963 - regression_loss: 0.0904 - classification_loss: 0.0059 389/500 [======================>.......] - ETA: 37s - loss: 0.0963 - regression_loss: 0.0904 - classification_loss: 0.0059 390/500 [======================>.......] - ETA: 37s - loss: 0.0961 - regression_loss: 0.0903 - classification_loss: 0.0059 391/500 [======================>.......] - ETA: 37s - loss: 0.0960 - regression_loss: 0.0902 - classification_loss: 0.0059 392/500 [======================>.......] - ETA: 36s - loss: 0.0964 - regression_loss: 0.0905 - classification_loss: 0.0059 393/500 [======================>.......] 
- ETA: 36s - loss: 0.0963 - regression_loss: 0.0904 - classification_loss: 0.0059 394/500 [======================>.......] - ETA: 36s - loss: 0.0962 - regression_loss: 0.0903 - classification_loss: 0.0059 395/500 [======================>.......] - ETA: 35s - loss: 0.0963 - regression_loss: 0.0904 - classification_loss: 0.0059 396/500 [======================>.......] - ETA: 35s - loss: 0.0961 - regression_loss: 0.0903 - classification_loss: 0.0059 397/500 [======================>.......] - ETA: 35s - loss: 0.0962 - regression_loss: 0.0903 - classification_loss: 0.0059 398/500 [======================>.......] - ETA: 34s - loss: 0.0963 - regression_loss: 0.0904 - classification_loss: 0.0059 399/500 [======================>.......] - ETA: 34s - loss: 0.0962 - regression_loss: 0.0903 - classification_loss: 0.0059 400/500 [=======================>......] - ETA: 34s - loss: 0.0962 - regression_loss: 0.0904 - classification_loss: 0.0059 401/500 [=======================>......] - ETA: 33s - loss: 0.0962 - regression_loss: 0.0903 - classification_loss: 0.0059 402/500 [=======================>......] - ETA: 33s - loss: 0.0961 - regression_loss: 0.0902 - classification_loss: 0.0059 403/500 [=======================>......] - ETA: 33s - loss: 0.0959 - regression_loss: 0.0900 - classification_loss: 0.0058 404/500 [=======================>......] - ETA: 32s - loss: 0.0961 - regression_loss: 0.0902 - classification_loss: 0.0059 405/500 [=======================>......] - ETA: 32s - loss: 0.0961 - regression_loss: 0.0902 - classification_loss: 0.0059 406/500 [=======================>......] - ETA: 32s - loss: 0.0962 - regression_loss: 0.0903 - classification_loss: 0.0059 407/500 [=======================>......] - ETA: 31s - loss: 0.0964 - regression_loss: 0.0905 - classification_loss: 0.0059 408/500 [=======================>......] - ETA: 31s - loss: 0.0971 - regression_loss: 0.0912 - classification_loss: 0.0059 409/500 [=======================>......] 
- ETA: 30s - loss: 0.0970 - regression_loss: 0.0911 - classification_loss: 0.0059 410/500 [=======================>......] - ETA: 30s - loss: 0.0969 - regression_loss: 0.0910 - classification_loss: 0.0059 411/500 [=======================>......] - ETA: 30s - loss: 0.0967 - regression_loss: 0.0908 - classification_loss: 0.0059 412/500 [=======================>......] - ETA: 29s - loss: 0.0968 - regression_loss: 0.0910 - classification_loss: 0.0059 413/500 [=======================>......] - ETA: 29s - loss: 0.0969 - regression_loss: 0.0910 - classification_loss: 0.0059 414/500 [=======================>......] - ETA: 29s - loss: 0.0967 - regression_loss: 0.0908 - classification_loss: 0.0059 415/500 [=======================>......] - ETA: 28s - loss: 0.0967 - regression_loss: 0.0908 - classification_loss: 0.0059 416/500 [=======================>......] - ETA: 28s - loss: 0.0965 - regression_loss: 0.0906 - classification_loss: 0.0059 417/500 [========================>.....] - ETA: 28s - loss: 0.0965 - regression_loss: 0.0907 - classification_loss: 0.0059 418/500 [========================>.....] - ETA: 27s - loss: 0.0964 - regression_loss: 0.0906 - classification_loss: 0.0059 419/500 [========================>.....] - ETA: 27s - loss: 0.0962 - regression_loss: 0.0904 - classification_loss: 0.0058 420/500 [========================>.....] - ETA: 27s - loss: 0.0961 - regression_loss: 0.0902 - classification_loss: 0.0058 421/500 [========================>.....] - ETA: 26s - loss: 0.0959 - regression_loss: 0.0901 - classification_loss: 0.0058 422/500 [========================>.....] - ETA: 26s - loss: 0.0962 - regression_loss: 0.0904 - classification_loss: 0.0058 423/500 [========================>.....] - ETA: 26s - loss: 0.0961 - regression_loss: 0.0903 - classification_loss: 0.0058 424/500 [========================>.....] - ETA: 25s - loss: 0.0962 - regression_loss: 0.0904 - classification_loss: 0.0058 425/500 [========================>.....] 
- ETA: 25s - loss: 0.0960 - regression_loss: 0.0902 - classification_loss: 0.0058 426/500 [========================>.....] - ETA: 25s - loss: 0.0962 - regression_loss: 0.0904 - classification_loss: 0.0058 427/500 [========================>.....] - ETA: 24s - loss: 0.0961 - regression_loss: 0.0902 - classification_loss: 0.0058 428/500 [========================>.....] - ETA: 24s - loss: 0.0961 - regression_loss: 0.0903 - classification_loss: 0.0058 429/500 [========================>.....] - ETA: 24s - loss: 0.0959 - regression_loss: 0.0901 - classification_loss: 0.0058 430/500 [========================>.....] - ETA: 23s - loss: 0.0959 - regression_loss: 0.0900 - classification_loss: 0.0058 431/500 [========================>.....] - ETA: 23s - loss: 0.0958 - regression_loss: 0.0900 - classification_loss: 0.0058 432/500 [========================>.....] - ETA: 23s - loss: 0.0957 - regression_loss: 0.0899 - classification_loss: 0.0058 433/500 [========================>.....] - ETA: 22s - loss: 0.0955 - regression_loss: 0.0898 - classification_loss: 0.0058 434/500 [=========================>....] - ETA: 22s - loss: 0.0956 - regression_loss: 0.0897 - classification_loss: 0.0058 435/500 [=========================>....] - ETA: 22s - loss: 0.0954 - regression_loss: 0.0896 - classification_loss: 0.0058 436/500 [=========================>....] - ETA: 21s - loss: 0.0954 - regression_loss: 0.0896 - classification_loss: 0.0058 437/500 [=========================>....] - ETA: 21s - loss: 0.0956 - regression_loss: 0.0898 - classification_loss: 0.0058 438/500 [=========================>....] - ETA: 21s - loss: 0.0954 - regression_loss: 0.0896 - classification_loss: 0.0058 439/500 [=========================>....] - ETA: 20s - loss: 0.0955 - regression_loss: 0.0897 - classification_loss: 0.0058 440/500 [=========================>....] - ETA: 20s - loss: 0.0953 - regression_loss: 0.0895 - classification_loss: 0.0058 441/500 [=========================>....] 
- ETA: 20s - loss: 0.0953 - regression_loss: 0.0895 - classification_loss: 0.0058 442/500 [=========================>....] - ETA: 19s - loss: 0.0953 - regression_loss: 0.0895 - classification_loss: 0.0058 443/500 [=========================>....] - ETA: 19s - loss: 0.0952 - regression_loss: 0.0894 - classification_loss: 0.0058 444/500 [=========================>....] - ETA: 19s - loss: 0.0951 - regression_loss: 0.0893 - classification_loss: 0.0058 445/500 [=========================>....] - ETA: 18s - loss: 0.0952 - regression_loss: 0.0894 - classification_loss: 0.0058 446/500 [=========================>....] - ETA: 18s - loss: 0.0950 - regression_loss: 0.0892 - classification_loss: 0.0058 447/500 [=========================>....] - ETA: 18s - loss: 0.0949 - regression_loss: 0.0891 - classification_loss: 0.0058 448/500 [=========================>....] - ETA: 17s - loss: 0.0951 - regression_loss: 0.0893 - classification_loss: 0.0058 449/500 [=========================>....] - ETA: 17s - loss: 0.0951 - regression_loss: 0.0893 - classification_loss: 0.0058 450/500 [==========================>...] - ETA: 17s - loss: 0.0950 - regression_loss: 0.0892 - classification_loss: 0.0058 451/500 [==========================>...] - ETA: 16s - loss: 0.0950 - regression_loss: 0.0892 - classification_loss: 0.0058 452/500 [==========================>...] - ETA: 16s - loss: 0.0951 - regression_loss: 0.0893 - classification_loss: 0.0058 453/500 [==========================>...] - ETA: 16s - loss: 0.0950 - regression_loss: 0.0892 - classification_loss: 0.0058 454/500 [==========================>...] - ETA: 15s - loss: 0.0948 - regression_loss: 0.0890 - classification_loss: 0.0058 455/500 [==========================>...] - ETA: 15s - loss: 0.0947 - regression_loss: 0.0889 - classification_loss: 0.0058 456/500 [==========================>...] - ETA: 14s - loss: 0.0946 - regression_loss: 0.0889 - classification_loss: 0.0058 457/500 [==========================>...] 
- ETA: 14s - loss: 0.0945 - regression_loss: 0.0887 - classification_loss: 0.0058 458/500 [==========================>...] - ETA: 14s - loss: 0.0946 - regression_loss: 0.0888 - classification_loss: 0.0058 459/500 [==========================>...] - ETA: 13s - loss: 0.0949 - regression_loss: 0.0891 - classification_loss: 0.0058 460/500 [==========================>...] - ETA: 13s - loss: 0.0950 - regression_loss: 0.0892 - classification_loss: 0.0058 461/500 [==========================>...] - ETA: 13s - loss: 0.0950 - regression_loss: 0.0892 - classification_loss: 0.0058 462/500 [==========================>...] - ETA: 12s - loss: 0.0950 - regression_loss: 0.0892 - classification_loss: 0.0058 463/500 [==========================>...] - ETA: 12s - loss: 0.0950 - regression_loss: 0.0892 - classification_loss: 0.0058 464/500 [==========================>...] - ETA: 12s - loss: 0.0950 - regression_loss: 0.0892 - classification_loss: 0.0058 465/500 [==========================>...] - ETA: 11s - loss: 0.0949 - regression_loss: 0.0892 - classification_loss: 0.0058 466/500 [==========================>...] - ETA: 11s - loss: 0.0948 - regression_loss: 0.0891 - classification_loss: 0.0058 467/500 [===========================>..] - ETA: 11s - loss: 0.0947 - regression_loss: 0.0889 - classification_loss: 0.0058 468/500 [===========================>..] - ETA: 10s - loss: 0.0947 - regression_loss: 0.0889 - classification_loss: 0.0058 469/500 [===========================>..] - ETA: 10s - loss: 0.0947 - regression_loss: 0.0890 - classification_loss: 0.0058 470/500 [===========================>..] - ETA: 10s - loss: 0.0946 - regression_loss: 0.0888 - classification_loss: 0.0058 471/500 [===========================>..] - ETA: 9s - loss: 0.0946 - regression_loss: 0.0888 - classification_loss: 0.0058  472/500 [===========================>..] - ETA: 9s - loss: 0.0945 - regression_loss: 0.0887 - classification_loss: 0.0058 473/500 [===========================>..] 
- ETA: 9s - loss: 0.0944 - regression_loss: 0.0887 - classification_loss: 0.0058 474/500 [===========================>..] - ETA: 8s - loss: 0.0943 - regression_loss: 0.0886 - classification_loss: 0.0057 475/500 [===========================>..] - ETA: 8s - loss: 0.0942 - regression_loss: 0.0884 - classification_loss: 0.0057 476/500 [===========================>..] - ETA: 8s - loss: 0.0941 - regression_loss: 0.0884 - classification_loss: 0.0057 477/500 [===========================>..] - ETA: 7s - loss: 0.0941 - regression_loss: 0.0883 - classification_loss: 0.0057 478/500 [===========================>..] - ETA: 7s - loss: 0.0941 - regression_loss: 0.0884 - classification_loss: 0.0057 479/500 [===========================>..] - ETA: 7s - loss: 0.0940 - regression_loss: 0.0882 - classification_loss: 0.0057 480/500 [===========================>..] - ETA: 6s - loss: 0.0939 - regression_loss: 0.0881 - classification_loss: 0.0057 481/500 [===========================>..] - ETA: 6s - loss: 0.0940 - regression_loss: 0.0882 - classification_loss: 0.0058 482/500 [===========================>..] - ETA: 6s - loss: 0.0940 - regression_loss: 0.0882 - classification_loss: 0.0058 483/500 [===========================>..] - ETA: 5s - loss: 0.0938 - regression_loss: 0.0881 - classification_loss: 0.0057 484/500 [============================>.] - ETA: 5s - loss: 0.0939 - regression_loss: 0.0882 - classification_loss: 0.0058 485/500 [============================>.] - ETA: 5s - loss: 0.0939 - regression_loss: 0.0881 - classification_loss: 0.0058 486/500 [============================>.] - ETA: 4s - loss: 0.0937 - regression_loss: 0.0880 - classification_loss: 0.0057 487/500 [============================>.] - ETA: 4s - loss: 0.0938 - regression_loss: 0.0881 - classification_loss: 0.0058 488/500 [============================>.] - ETA: 4s - loss: 0.0937 - regression_loss: 0.0880 - classification_loss: 0.0057 489/500 [============================>.] 
[Final progress-bar updates for steps 490-499 of epoch 53 omitted.]
500/500 [==============================] - 170s 341ms/step - loss: 0.0941 - regression_loss: 0.0882 - classification_loss: 0.0059
1172 instances of class plum with average precision: 0.7599
mAP: 0.7599
Epoch 00053: saving model to ./training/snapshots/resnet101_pascal_53.h5
Epoch 54/150
[Progress-bar updates for steps 1-4 of epoch 54 omitted: loss started in the 0.04-0.06 range with ETA ~2:45.]
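As a side note for working with logs like this one: the end-of-epoch summary lines above carry all the durable information, and they follow a fixed format. The following is a minimal sketch (not part of the original training tooling; the regex and function name are my own) of how such a summary line could be parsed back into numbers:

```python
import re

# Matches a Keras end-of-epoch summary line such as:
# "500/500 [====] - 170s 341ms/step - loss: 0.0941 - regression_loss: 0.0882 - classification_loss: 0.0059"
SUMMARY_RE = re.compile(
    r"\[=+\].*?loss: (?P<loss>[\d.]+) - "
    r"regression_loss: (?P<regression_loss>[\d.]+) - "
    r"classification_loss: (?P<classification_loss>[\d.]+)"
)

def parse_epoch_summary(line: str) -> dict:
    """Extract total/regression/classification loss from an end-of-epoch log line."""
    m = SUMMARY_RE.search(line)
    if m is None:
        raise ValueError("no epoch summary found in line")
    return {name: float(value) for name, value in m.groupdict().items()}

example = ("500/500 [==============================] - 170s 341ms/step - "
           "loss: 0.0941 - regression_loss: 0.0882 - classification_loss: 0.0059")
print(parse_epoch_summary(example))
```

Running this over each saved epoch's summary line yields a loss curve without keeping the megabytes of carriage-return progress-bar updates.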
[Per-step progress-bar updates for steps 5-84 of epoch 54 omitted: ETA counted down from ~2:46 to ~2:20 while the running loss fluctuated between ~0.067 and ~0.104 before settling near 0.090 (regression_loss ~0.085, classification_loss ~0.0057).]
- ETA: 2:20 - loss: 0.0907 - regression_loss: 0.0850 - classification_loss: 0.0057 85/500 [====>.........................] - ETA: 2:20 - loss: 0.0903 - regression_loss: 0.0847 - classification_loss: 0.0057 86/500 [====>.........................] - ETA: 2:19 - loss: 0.0896 - regression_loss: 0.0839 - classification_loss: 0.0056 87/500 [====>.........................] - ETA: 2:19 - loss: 0.0887 - regression_loss: 0.0831 - classification_loss: 0.0056 88/500 [====>.........................] - ETA: 2:19 - loss: 0.0886 - regression_loss: 0.0831 - classification_loss: 0.0055 89/500 [====>.........................] - ETA: 2:18 - loss: 0.0897 - regression_loss: 0.0841 - classification_loss: 0.0055 90/500 [====>.........................] - ETA: 2:18 - loss: 0.0898 - regression_loss: 0.0842 - classification_loss: 0.0056 91/500 [====>.........................] - ETA: 2:18 - loss: 0.0900 - regression_loss: 0.0844 - classification_loss: 0.0056 92/500 [====>.........................] - ETA: 2:18 - loss: 0.0898 - regression_loss: 0.0843 - classification_loss: 0.0055 93/500 [====>.........................] - ETA: 2:17 - loss: 0.0908 - regression_loss: 0.0852 - classification_loss: 0.0056 94/500 [====>.........................] - ETA: 2:17 - loss: 0.0903 - regression_loss: 0.0848 - classification_loss: 0.0055 95/500 [====>.........................] - ETA: 2:17 - loss: 0.0896 - regression_loss: 0.0841 - classification_loss: 0.0055 96/500 [====>.........................] - ETA: 2:16 - loss: 0.0914 - regression_loss: 0.0857 - classification_loss: 0.0057 97/500 [====>.........................] - ETA: 2:16 - loss: 0.0916 - regression_loss: 0.0859 - classification_loss: 0.0057 98/500 [====>.........................] - ETA: 2:16 - loss: 0.0920 - regression_loss: 0.0863 - classification_loss: 0.0057 99/500 [====>.........................] - ETA: 2:15 - loss: 0.0920 - regression_loss: 0.0863 - classification_loss: 0.0057 100/500 [=====>........................] 
- ETA: 2:15 - loss: 0.0914 - regression_loss: 0.0857 - classification_loss: 0.0057 101/500 [=====>........................] - ETA: 2:15 - loss: 0.0918 - regression_loss: 0.0861 - classification_loss: 0.0057 102/500 [=====>........................] - ETA: 2:14 - loss: 0.0914 - regression_loss: 0.0857 - classification_loss: 0.0056 103/500 [=====>........................] - ETA: 2:14 - loss: 0.0912 - regression_loss: 0.0856 - classification_loss: 0.0056 104/500 [=====>........................] - ETA: 2:14 - loss: 0.0911 - regression_loss: 0.0855 - classification_loss: 0.0056 105/500 [=====>........................] - ETA: 2:13 - loss: 0.0910 - regression_loss: 0.0854 - classification_loss: 0.0056 106/500 [=====>........................] - ETA: 2:13 - loss: 0.0916 - regression_loss: 0.0860 - classification_loss: 0.0056 107/500 [=====>........................] - ETA: 2:13 - loss: 0.0921 - regression_loss: 0.0864 - classification_loss: 0.0058 108/500 [=====>........................] - ETA: 2:13 - loss: 0.0917 - regression_loss: 0.0860 - classification_loss: 0.0057 109/500 [=====>........................] - ETA: 2:12 - loss: 0.0912 - regression_loss: 0.0855 - classification_loss: 0.0057 110/500 [=====>........................] - ETA: 2:12 - loss: 0.0911 - regression_loss: 0.0854 - classification_loss: 0.0057 111/500 [=====>........................] - ETA: 2:12 - loss: 0.0909 - regression_loss: 0.0852 - classification_loss: 0.0057 112/500 [=====>........................] - ETA: 2:11 - loss: 0.0904 - regression_loss: 0.0847 - classification_loss: 0.0057 113/500 [=====>........................] - ETA: 2:11 - loss: 0.0907 - regression_loss: 0.0850 - classification_loss: 0.0057 114/500 [=====>........................] - ETA: 2:11 - loss: 0.0905 - regression_loss: 0.0848 - classification_loss: 0.0057 115/500 [=====>........................] - ETA: 2:10 - loss: 0.0900 - regression_loss: 0.0843 - classification_loss: 0.0057 116/500 [=====>........................] 
- ETA: 2:10 - loss: 0.0917 - regression_loss: 0.0859 - classification_loss: 0.0057 117/500 [======>.......................] - ETA: 2:10 - loss: 0.0920 - regression_loss: 0.0862 - classification_loss: 0.0058 118/500 [======>.......................] - ETA: 2:09 - loss: 0.0928 - regression_loss: 0.0869 - classification_loss: 0.0059 119/500 [======>.......................] - ETA: 2:09 - loss: 0.0936 - regression_loss: 0.0877 - classification_loss: 0.0059 120/500 [======>.......................] - ETA: 2:09 - loss: 0.0931 - regression_loss: 0.0872 - classification_loss: 0.0059 121/500 [======>.......................] - ETA: 2:08 - loss: 0.0934 - regression_loss: 0.0875 - classification_loss: 0.0059 122/500 [======>.......................] - ETA: 2:08 - loss: 0.0931 - regression_loss: 0.0872 - classification_loss: 0.0059 123/500 [======>.......................] - ETA: 2:08 - loss: 0.0925 - regression_loss: 0.0866 - classification_loss: 0.0059 124/500 [======>.......................] - ETA: 2:07 - loss: 0.0921 - regression_loss: 0.0863 - classification_loss: 0.0058 125/500 [======>.......................] - ETA: 2:07 - loss: 0.0917 - regression_loss: 0.0859 - classification_loss: 0.0058 126/500 [======>.......................] - ETA: 2:07 - loss: 0.0912 - regression_loss: 0.0854 - classification_loss: 0.0058 127/500 [======>.......................] - ETA: 2:06 - loss: 0.0906 - regression_loss: 0.0848 - classification_loss: 0.0057 128/500 [======>.......................] - ETA: 2:06 - loss: 0.0908 - regression_loss: 0.0849 - classification_loss: 0.0058 129/500 [======>.......................] - ETA: 2:06 - loss: 0.0928 - regression_loss: 0.0871 - classification_loss: 0.0058 130/500 [======>.......................] - ETA: 2:05 - loss: 0.0927 - regression_loss: 0.0869 - classification_loss: 0.0058 131/500 [======>.......................] - ETA: 2:05 - loss: 0.0925 - regression_loss: 0.0867 - classification_loss: 0.0058 132/500 [======>.......................] 
- ETA: 2:05 - loss: 0.0920 - regression_loss: 0.0863 - classification_loss: 0.0058 133/500 [======>.......................] - ETA: 2:04 - loss: 0.0916 - regression_loss: 0.0859 - classification_loss: 0.0057 134/500 [=======>......................] - ETA: 2:04 - loss: 0.0912 - regression_loss: 0.0855 - classification_loss: 0.0057 135/500 [=======>......................] - ETA: 2:04 - loss: 0.0912 - regression_loss: 0.0855 - classification_loss: 0.0057 136/500 [=======>......................] - ETA: 2:03 - loss: 0.0908 - regression_loss: 0.0851 - classification_loss: 0.0057 137/500 [=======>......................] - ETA: 2:03 - loss: 0.0905 - regression_loss: 0.0849 - classification_loss: 0.0057 138/500 [=======>......................] - ETA: 2:03 - loss: 0.0901 - regression_loss: 0.0845 - classification_loss: 0.0056 139/500 [=======>......................] - ETA: 2:02 - loss: 0.0895 - regression_loss: 0.0839 - classification_loss: 0.0056 140/500 [=======>......................] - ETA: 2:02 - loss: 0.0897 - regression_loss: 0.0841 - classification_loss: 0.0056 141/500 [=======>......................] - ETA: 2:01 - loss: 0.0901 - regression_loss: 0.0845 - classification_loss: 0.0056 142/500 [=======>......................] - ETA: 2:01 - loss: 0.0901 - regression_loss: 0.0845 - classification_loss: 0.0056 143/500 [=======>......................] - ETA: 2:01 - loss: 0.0906 - regression_loss: 0.0848 - classification_loss: 0.0058 144/500 [=======>......................] - ETA: 2:01 - loss: 0.0904 - regression_loss: 0.0847 - classification_loss: 0.0058 145/500 [=======>......................] - ETA: 2:00 - loss: 0.0902 - regression_loss: 0.0845 - classification_loss: 0.0057 146/500 [=======>......................] - ETA: 2:00 - loss: 0.0900 - regression_loss: 0.0843 - classification_loss: 0.0057 147/500 [=======>......................] - ETA: 2:00 - loss: 0.0897 - regression_loss: 0.0840 - classification_loss: 0.0057 148/500 [=======>......................] 
- ETA: 1:59 - loss: 0.0898 - regression_loss: 0.0841 - classification_loss: 0.0057 149/500 [=======>......................] - ETA: 1:59 - loss: 0.0895 - regression_loss: 0.0839 - classification_loss: 0.0057 150/500 [========>.....................] - ETA: 1:59 - loss: 0.0893 - regression_loss: 0.0836 - classification_loss: 0.0056 151/500 [========>.....................] - ETA: 1:58 - loss: 0.0903 - regression_loss: 0.0846 - classification_loss: 0.0057 152/500 [========>.....................] - ETA: 1:58 - loss: 0.0903 - regression_loss: 0.0846 - classification_loss: 0.0057 153/500 [========>.....................] - ETA: 1:58 - loss: 0.0901 - regression_loss: 0.0845 - classification_loss: 0.0056 154/500 [========>.....................] - ETA: 1:57 - loss: 0.0898 - regression_loss: 0.0842 - classification_loss: 0.0056 155/500 [========>.....................] - ETA: 1:57 - loss: 0.0897 - regression_loss: 0.0841 - classification_loss: 0.0056 156/500 [========>.....................] - ETA: 1:56 - loss: 0.0898 - regression_loss: 0.0842 - classification_loss: 0.0056 157/500 [========>.....................] - ETA: 1:56 - loss: 0.0902 - regression_loss: 0.0845 - classification_loss: 0.0057 158/500 [========>.....................] - ETA: 1:56 - loss: 0.0900 - regression_loss: 0.0843 - classification_loss: 0.0057 159/500 [========>.....................] - ETA: 1:55 - loss: 0.0897 - regression_loss: 0.0840 - classification_loss: 0.0057 160/500 [========>.....................] - ETA: 1:55 - loss: 0.0898 - regression_loss: 0.0841 - classification_loss: 0.0057 161/500 [========>.....................] - ETA: 1:55 - loss: 0.0898 - regression_loss: 0.0841 - classification_loss: 0.0057 162/500 [========>.....................] - ETA: 1:54 - loss: 0.0895 - regression_loss: 0.0838 - classification_loss: 0.0057 163/500 [========>.....................] - ETA: 1:54 - loss: 0.0899 - regression_loss: 0.0842 - classification_loss: 0.0057 164/500 [========>.....................] 
- ETA: 1:54 - loss: 0.0899 - regression_loss: 0.0842 - classification_loss: 0.0057 165/500 [========>.....................] - ETA: 1:53 - loss: 0.0897 - regression_loss: 0.0840 - classification_loss: 0.0057 166/500 [========>.....................] - ETA: 1:53 - loss: 0.0904 - regression_loss: 0.0847 - classification_loss: 0.0057 167/500 [=========>....................] - ETA: 1:53 - loss: 0.0903 - regression_loss: 0.0846 - classification_loss: 0.0057 168/500 [=========>....................] - ETA: 1:52 - loss: 0.0898 - regression_loss: 0.0842 - classification_loss: 0.0057 169/500 [=========>....................] - ETA: 1:52 - loss: 0.0898 - regression_loss: 0.0842 - classification_loss: 0.0057 170/500 [=========>....................] - ETA: 1:52 - loss: 0.0900 - regression_loss: 0.0843 - classification_loss: 0.0057 171/500 [=========>....................] - ETA: 1:51 - loss: 0.0900 - regression_loss: 0.0844 - classification_loss: 0.0057 172/500 [=========>....................] - ETA: 1:51 - loss: 0.0900 - regression_loss: 0.0844 - classification_loss: 0.0057 173/500 [=========>....................] - ETA: 1:51 - loss: 0.0898 - regression_loss: 0.0841 - classification_loss: 0.0057 174/500 [=========>....................] - ETA: 1:50 - loss: 0.0895 - regression_loss: 0.0839 - classification_loss: 0.0056 175/500 [=========>....................] - ETA: 1:50 - loss: 0.0892 - regression_loss: 0.0836 - classification_loss: 0.0056 176/500 [=========>....................] - ETA: 1:50 - loss: 0.0893 - regression_loss: 0.0837 - classification_loss: 0.0056 177/500 [=========>....................] - ETA: 1:49 - loss: 0.0890 - regression_loss: 0.0834 - classification_loss: 0.0056 178/500 [=========>....................] - ETA: 1:49 - loss: 0.0888 - regression_loss: 0.0832 - classification_loss: 0.0056 179/500 [=========>....................] - ETA: 1:49 - loss: 0.0885 - regression_loss: 0.0829 - classification_loss: 0.0056 180/500 [=========>....................] 
- ETA: 1:48 - loss: 0.0882 - regression_loss: 0.0826 - classification_loss: 0.0056 181/500 [=========>....................] - ETA: 1:48 - loss: 0.0885 - regression_loss: 0.0829 - classification_loss: 0.0056 182/500 [=========>....................] - ETA: 1:47 - loss: 0.0885 - regression_loss: 0.0830 - classification_loss: 0.0056 183/500 [=========>....................] - ETA: 1:47 - loss: 0.0881 - regression_loss: 0.0826 - classification_loss: 0.0056 184/500 [==========>...................] - ETA: 1:47 - loss: 0.0881 - regression_loss: 0.0826 - classification_loss: 0.0055 185/500 [==========>...................] - ETA: 1:46 - loss: 0.0879 - regression_loss: 0.0823 - classification_loss: 0.0055 186/500 [==========>...................] - ETA: 1:46 - loss: 0.0877 - regression_loss: 0.0821 - classification_loss: 0.0055 187/500 [==========>...................] - ETA: 1:46 - loss: 0.0873 - regression_loss: 0.0818 - classification_loss: 0.0055 188/500 [==========>...................] - ETA: 1:45 - loss: 0.0875 - regression_loss: 0.0819 - classification_loss: 0.0055 189/500 [==========>...................] - ETA: 1:45 - loss: 0.0871 - regression_loss: 0.0816 - classification_loss: 0.0055 190/500 [==========>...................] - ETA: 1:45 - loss: 0.0872 - regression_loss: 0.0817 - classification_loss: 0.0055 191/500 [==========>...................] - ETA: 1:44 - loss: 0.0870 - regression_loss: 0.0816 - classification_loss: 0.0055 192/500 [==========>...................] - ETA: 1:44 - loss: 0.0876 - regression_loss: 0.0821 - classification_loss: 0.0055 193/500 [==========>...................] - ETA: 1:44 - loss: 0.0876 - regression_loss: 0.0820 - classification_loss: 0.0056 194/500 [==========>...................] - ETA: 1:43 - loss: 0.0876 - regression_loss: 0.0820 - classification_loss: 0.0056 195/500 [==========>...................] - ETA: 1:43 - loss: 0.0884 - regression_loss: 0.0828 - classification_loss: 0.0056 196/500 [==========>...................] 
- ETA: 1:43 - loss: 0.0881 - regression_loss: 0.0825 - classification_loss: 0.0056 197/500 [==========>...................] - ETA: 1:42 - loss: 0.0880 - regression_loss: 0.0824 - classification_loss: 0.0056 198/500 [==========>...................] - ETA: 1:42 - loss: 0.0882 - regression_loss: 0.0827 - classification_loss: 0.0056 199/500 [==========>...................] - ETA: 1:42 - loss: 0.0885 - regression_loss: 0.0830 - classification_loss: 0.0056 200/500 [===========>..................] - ETA: 1:41 - loss: 0.0885 - regression_loss: 0.0829 - classification_loss: 0.0056 201/500 [===========>..................] - ETA: 1:41 - loss: 0.0885 - regression_loss: 0.0829 - classification_loss: 0.0056 202/500 [===========>..................] - ETA: 1:41 - loss: 0.0884 - regression_loss: 0.0829 - classification_loss: 0.0056 203/500 [===========>..................] - ETA: 1:40 - loss: 0.0882 - regression_loss: 0.0827 - classification_loss: 0.0055 204/500 [===========>..................] - ETA: 1:40 - loss: 0.0884 - regression_loss: 0.0828 - classification_loss: 0.0056 205/500 [===========>..................] - ETA: 1:40 - loss: 0.0883 - regression_loss: 0.0828 - classification_loss: 0.0055 206/500 [===========>..................] - ETA: 1:39 - loss: 0.0882 - regression_loss: 0.0826 - classification_loss: 0.0055 207/500 [===========>..................] - ETA: 1:39 - loss: 0.0879 - regression_loss: 0.0824 - classification_loss: 0.0055 208/500 [===========>..................] - ETA: 1:39 - loss: 0.0878 - regression_loss: 0.0823 - classification_loss: 0.0055 209/500 [===========>..................] - ETA: 1:38 - loss: 0.0876 - regression_loss: 0.0821 - classification_loss: 0.0055 210/500 [===========>..................] - ETA: 1:38 - loss: 0.0875 - regression_loss: 0.0820 - classification_loss: 0.0055 211/500 [===========>..................] - ETA: 1:38 - loss: 0.0871 - regression_loss: 0.0817 - classification_loss: 0.0055 212/500 [===========>..................] 
- ETA: 1:37 - loss: 0.0873 - regression_loss: 0.0818 - classification_loss: 0.0055 213/500 [===========>..................] - ETA: 1:37 - loss: 0.0872 - regression_loss: 0.0818 - classification_loss: 0.0055 214/500 [===========>..................] - ETA: 1:37 - loss: 0.0869 - regression_loss: 0.0815 - classification_loss: 0.0054 215/500 [===========>..................] - ETA: 1:36 - loss: 0.0869 - regression_loss: 0.0815 - classification_loss: 0.0054 216/500 [===========>..................] - ETA: 1:36 - loss: 0.0870 - regression_loss: 0.0815 - classification_loss: 0.0054 217/500 [============>.................] - ETA: 1:36 - loss: 0.0873 - regression_loss: 0.0819 - classification_loss: 0.0055 218/500 [============>.................] - ETA: 1:35 - loss: 0.0880 - regression_loss: 0.0825 - classification_loss: 0.0055 219/500 [============>.................] - ETA: 1:35 - loss: 0.0883 - regression_loss: 0.0827 - classification_loss: 0.0056 220/500 [============>.................] - ETA: 1:35 - loss: 0.0891 - regression_loss: 0.0833 - classification_loss: 0.0058 221/500 [============>.................] - ETA: 1:34 - loss: 0.0889 - regression_loss: 0.0830 - classification_loss: 0.0058 222/500 [============>.................] - ETA: 1:34 - loss: 0.0892 - regression_loss: 0.0833 - classification_loss: 0.0059 223/500 [============>.................] - ETA: 1:34 - loss: 0.0892 - regression_loss: 0.0833 - classification_loss: 0.0059 224/500 [============>.................] - ETA: 1:33 - loss: 0.0890 - regression_loss: 0.0832 - classification_loss: 0.0059 225/500 [============>.................] - ETA: 1:33 - loss: 0.0889 - regression_loss: 0.0831 - classification_loss: 0.0059 226/500 [============>.................] - ETA: 1:33 - loss: 0.0886 - regression_loss: 0.0828 - classification_loss: 0.0058 227/500 [============>.................] - ETA: 1:32 - loss: 0.0886 - regression_loss: 0.0827 - classification_loss: 0.0059 228/500 [============>.................] 
- ETA: 1:32 - loss: 0.0886 - regression_loss: 0.0827 - classification_loss: 0.0059 229/500 [============>.................] - ETA: 1:32 - loss: 0.0883 - regression_loss: 0.0825 - classification_loss: 0.0059 230/500 [============>.................] - ETA: 1:31 - loss: 0.0886 - regression_loss: 0.0827 - classification_loss: 0.0059 231/500 [============>.................] - ETA: 1:31 - loss: 0.0883 - regression_loss: 0.0824 - classification_loss: 0.0059 232/500 [============>.................] - ETA: 1:31 - loss: 0.0889 - regression_loss: 0.0830 - classification_loss: 0.0059 233/500 [============>.................] - ETA: 1:30 - loss: 0.0886 - regression_loss: 0.0828 - classification_loss: 0.0059 234/500 [=============>................] - ETA: 1:30 - loss: 0.0886 - regression_loss: 0.0827 - classification_loss: 0.0059 235/500 [=============>................] - ETA: 1:30 - loss: 0.0889 - regression_loss: 0.0829 - classification_loss: 0.0059 236/500 [=============>................] - ETA: 1:29 - loss: 0.0886 - regression_loss: 0.0827 - classification_loss: 0.0059 237/500 [=============>................] - ETA: 1:29 - loss: 0.0890 - regression_loss: 0.0831 - classification_loss: 0.0059 238/500 [=============>................] - ETA: 1:29 - loss: 0.0890 - regression_loss: 0.0831 - classification_loss: 0.0059 239/500 [=============>................] - ETA: 1:28 - loss: 0.0888 - regression_loss: 0.0829 - classification_loss: 0.0059 240/500 [=============>................] - ETA: 1:28 - loss: 0.0886 - regression_loss: 0.0827 - classification_loss: 0.0059 241/500 [=============>................] - ETA: 1:28 - loss: 0.0884 - regression_loss: 0.0825 - classification_loss: 0.0059 242/500 [=============>................] - ETA: 1:27 - loss: 0.0882 - regression_loss: 0.0823 - classification_loss: 0.0059 243/500 [=============>................] - ETA: 1:27 - loss: 0.0881 - regression_loss: 0.0822 - classification_loss: 0.0059 244/500 [=============>................] 
- ETA: 1:27 - loss: 0.0878 - regression_loss: 0.0820 - classification_loss: 0.0059 245/500 [=============>................] - ETA: 1:26 - loss: 0.0876 - regression_loss: 0.0818 - classification_loss: 0.0058 246/500 [=============>................] - ETA: 1:26 - loss: 0.0878 - regression_loss: 0.0819 - classification_loss: 0.0059 247/500 [=============>................] - ETA: 1:26 - loss: 0.0875 - regression_loss: 0.0817 - classification_loss: 0.0058 248/500 [=============>................] - ETA: 1:25 - loss: 0.0874 - regression_loss: 0.0816 - classification_loss: 0.0058 249/500 [=============>................] - ETA: 1:25 - loss: 0.0875 - regression_loss: 0.0816 - classification_loss: 0.0059 250/500 [==============>...............] - ETA: 1:25 - loss: 0.0874 - regression_loss: 0.0815 - classification_loss: 0.0059 251/500 [==============>...............] - ETA: 1:24 - loss: 0.0875 - regression_loss: 0.0816 - classification_loss: 0.0059 252/500 [==============>...............] - ETA: 1:24 - loss: 0.0874 - regression_loss: 0.0816 - classification_loss: 0.0059 253/500 [==============>...............] - ETA: 1:23 - loss: 0.0873 - regression_loss: 0.0814 - classification_loss: 0.0059 254/500 [==============>...............] - ETA: 1:23 - loss: 0.0870 - regression_loss: 0.0812 - classification_loss: 0.0058 255/500 [==============>...............] - ETA: 1:23 - loss: 0.0870 - regression_loss: 0.0811 - classification_loss: 0.0058 256/500 [==============>...............] - ETA: 1:22 - loss: 0.0867 - regression_loss: 0.0809 - classification_loss: 0.0058 257/500 [==============>...............] - ETA: 1:22 - loss: 0.0865 - regression_loss: 0.0807 - classification_loss: 0.0058 258/500 [==============>...............] - ETA: 1:22 - loss: 0.0865 - regression_loss: 0.0807 - classification_loss: 0.0058 259/500 [==============>...............] - ETA: 1:21 - loss: 0.0866 - regression_loss: 0.0808 - classification_loss: 0.0058 260/500 [==============>...............] 
- ETA: 1:21 - loss: 0.0864 - regression_loss: 0.0806 - classification_loss: 0.0058 261/500 [==============>...............] - ETA: 1:21 - loss: 0.0861 - regression_loss: 0.0804 - classification_loss: 0.0058 262/500 [==============>...............] - ETA: 1:20 - loss: 0.0860 - regression_loss: 0.0803 - classification_loss: 0.0058 263/500 [==============>...............] - ETA: 1:20 - loss: 0.0859 - regression_loss: 0.0802 - classification_loss: 0.0058 264/500 [==============>...............] - ETA: 1:20 - loss: 0.0859 - regression_loss: 0.0801 - classification_loss: 0.0058 265/500 [==============>...............] - ETA: 1:19 - loss: 0.0857 - regression_loss: 0.0799 - classification_loss: 0.0057 266/500 [==============>...............] - ETA: 1:19 - loss: 0.0857 - regression_loss: 0.0799 - classification_loss: 0.0057 267/500 [===============>..............] - ETA: 1:19 - loss: 0.0858 - regression_loss: 0.0800 - classification_loss: 0.0057 268/500 [===============>..............] - ETA: 1:18 - loss: 0.0859 - regression_loss: 0.0802 - classification_loss: 0.0057 269/500 [===============>..............] - ETA: 1:18 - loss: 0.0859 - regression_loss: 0.0801 - classification_loss: 0.0057 270/500 [===============>..............] - ETA: 1:18 - loss: 0.0861 - regression_loss: 0.0804 - classification_loss: 0.0057 271/500 [===============>..............] - ETA: 1:17 - loss: 0.0859 - regression_loss: 0.0802 - classification_loss: 0.0057 272/500 [===============>..............] - ETA: 1:17 - loss: 0.0859 - regression_loss: 0.0802 - classification_loss: 0.0057 273/500 [===============>..............] - ETA: 1:17 - loss: 0.0857 - regression_loss: 0.0800 - classification_loss: 0.0057 274/500 [===============>..............] - ETA: 1:16 - loss: 0.0856 - regression_loss: 0.0799 - classification_loss: 0.0057 275/500 [===============>..............] - ETA: 1:16 - loss: 0.0857 - regression_loss: 0.0800 - classification_loss: 0.0057 276/500 [===============>..............] 
- ETA: 1:16 - loss: 0.0864 - regression_loss: 0.0804 - classification_loss: 0.0060 277/500 [===============>..............] - ETA: 1:15 - loss: 0.0865 - regression_loss: 0.0806 - classification_loss: 0.0059 278/500 [===============>..............] - ETA: 1:15 - loss: 0.0863 - regression_loss: 0.0804 - classification_loss: 0.0059 279/500 [===============>..............] - ETA: 1:15 - loss: 0.0861 - regression_loss: 0.0802 - classification_loss: 0.0059 280/500 [===============>..............] - ETA: 1:14 - loss: 0.0860 - regression_loss: 0.0801 - classification_loss: 0.0059 281/500 [===============>..............] - ETA: 1:14 - loss: 0.0860 - regression_loss: 0.0801 - classification_loss: 0.0059 282/500 [===============>..............] - ETA: 1:14 - loss: 0.0858 - regression_loss: 0.0799 - classification_loss: 0.0059 283/500 [===============>..............] - ETA: 1:13 - loss: 0.0856 - regression_loss: 0.0797 - classification_loss: 0.0059 284/500 [================>.............] - ETA: 1:13 - loss: 0.0855 - regression_loss: 0.0796 - classification_loss: 0.0059 285/500 [================>.............] - ETA: 1:13 - loss: 0.0855 - regression_loss: 0.0796 - classification_loss: 0.0059 286/500 [================>.............] - ETA: 1:12 - loss: 0.0856 - regression_loss: 0.0797 - classification_loss: 0.0059 287/500 [================>.............] - ETA: 1:12 - loss: 0.0860 - regression_loss: 0.0800 - classification_loss: 0.0060 288/500 [================>.............] - ETA: 1:12 - loss: 0.0858 - regression_loss: 0.0798 - classification_loss: 0.0059 289/500 [================>.............] - ETA: 1:11 - loss: 0.0857 - regression_loss: 0.0798 - classification_loss: 0.0059 290/500 [================>.............] - ETA: 1:11 - loss: 0.0856 - regression_loss: 0.0797 - classification_loss: 0.0059 291/500 [================>.............] - ETA: 1:11 - loss: 0.0859 - regression_loss: 0.0799 - classification_loss: 0.0060 292/500 [================>.............] 
500/500 [==============================] - 170s 340ms/step - loss: 0.0848 - regression_loss: 0.0789 - classification_loss: 0.0058
1172 instances of class plum with average precision: 0.7593
mAP: 0.7593
Epoch 00054: saving model to ./training/snapshots/resnet101_pascal_54.h5
Epoch 55/150
126/500 [======>.......................]
- ETA: 2:06 - loss: 0.0769 - regression_loss: 0.0716 - classification_loss: 0.0053 127/500 [======>.......................] - ETA: 2:06 - loss: 0.0766 - regression_loss: 0.0714 - classification_loss: 0.0053 128/500 [======>.......................] - ETA: 2:05 - loss: 0.0766 - regression_loss: 0.0713 - classification_loss: 0.0053 129/500 [======>.......................] - ETA: 2:05 - loss: 0.0766 - regression_loss: 0.0713 - classification_loss: 0.0053 130/500 [======>.......................] - ETA: 2:05 - loss: 0.0774 - regression_loss: 0.0721 - classification_loss: 0.0053 131/500 [======>.......................] - ETA: 2:05 - loss: 0.0777 - regression_loss: 0.0723 - classification_loss: 0.0053 132/500 [======>.......................] - ETA: 2:04 - loss: 0.0773 - regression_loss: 0.0720 - classification_loss: 0.0053 133/500 [======>.......................] - ETA: 2:04 - loss: 0.0771 - regression_loss: 0.0718 - classification_loss: 0.0053 134/500 [=======>......................] - ETA: 2:03 - loss: 0.0770 - regression_loss: 0.0717 - classification_loss: 0.0053 135/500 [=======>......................] - ETA: 2:03 - loss: 0.0770 - regression_loss: 0.0717 - classification_loss: 0.0053 136/500 [=======>......................] - ETA: 2:03 - loss: 0.0772 - regression_loss: 0.0720 - classification_loss: 0.0053 137/500 [=======>......................] - ETA: 2:03 - loss: 0.0769 - regression_loss: 0.0716 - classification_loss: 0.0053 138/500 [=======>......................] - ETA: 2:02 - loss: 0.0766 - regression_loss: 0.0713 - classification_loss: 0.0052 139/500 [=======>......................] - ETA: 2:02 - loss: 0.0769 - regression_loss: 0.0716 - classification_loss: 0.0052 140/500 [=======>......................] - ETA: 2:01 - loss: 0.0766 - regression_loss: 0.0714 - classification_loss: 0.0052 141/500 [=======>......................] - ETA: 2:01 - loss: 0.0774 - regression_loss: 0.0720 - classification_loss: 0.0053 142/500 [=======>......................] 
- ETA: 2:01 - loss: 0.0770 - regression_loss: 0.0717 - classification_loss: 0.0053 143/500 [=======>......................] - ETA: 2:01 - loss: 0.0770 - regression_loss: 0.0717 - classification_loss: 0.0053 144/500 [=======>......................] - ETA: 2:00 - loss: 0.0771 - regression_loss: 0.0718 - classification_loss: 0.0053 145/500 [=======>......................] - ETA: 2:00 - loss: 0.0771 - regression_loss: 0.0718 - classification_loss: 0.0053 146/500 [=======>......................] - ETA: 2:00 - loss: 0.0770 - regression_loss: 0.0717 - classification_loss: 0.0053 147/500 [=======>......................] - ETA: 1:59 - loss: 0.0772 - regression_loss: 0.0719 - classification_loss: 0.0053 148/500 [=======>......................] - ETA: 1:59 - loss: 0.0769 - regression_loss: 0.0716 - classification_loss: 0.0053 149/500 [=======>......................] - ETA: 1:58 - loss: 0.0782 - regression_loss: 0.0724 - classification_loss: 0.0057 150/500 [========>.....................] - ETA: 1:58 - loss: 0.0779 - regression_loss: 0.0722 - classification_loss: 0.0057 151/500 [========>.....................] - ETA: 1:58 - loss: 0.0777 - regression_loss: 0.0720 - classification_loss: 0.0057 152/500 [========>.....................] - ETA: 1:57 - loss: 0.0778 - regression_loss: 0.0721 - classification_loss: 0.0057 153/500 [========>.....................] - ETA: 1:57 - loss: 0.0775 - regression_loss: 0.0718 - classification_loss: 0.0057 154/500 [========>.....................] - ETA: 1:57 - loss: 0.0777 - regression_loss: 0.0720 - classification_loss: 0.0057 155/500 [========>.....................] - ETA: 1:56 - loss: 0.0778 - regression_loss: 0.0721 - classification_loss: 0.0057 156/500 [========>.....................] - ETA: 1:56 - loss: 0.0782 - regression_loss: 0.0724 - classification_loss: 0.0058 157/500 [========>.....................] - ETA: 1:56 - loss: 0.0782 - regression_loss: 0.0724 - classification_loss: 0.0057 158/500 [========>.....................] 
- ETA: 1:55 - loss: 0.0784 - regression_loss: 0.0726 - classification_loss: 0.0057 159/500 [========>.....................] - ETA: 1:55 - loss: 0.0784 - regression_loss: 0.0726 - classification_loss: 0.0057 160/500 [========>.....................] - ETA: 1:55 - loss: 0.0781 - regression_loss: 0.0724 - classification_loss: 0.0057 161/500 [========>.....................] - ETA: 1:54 - loss: 0.0777 - regression_loss: 0.0721 - classification_loss: 0.0057 162/500 [========>.....................] - ETA: 1:54 - loss: 0.0776 - regression_loss: 0.0720 - classification_loss: 0.0057 163/500 [========>.....................] - ETA: 1:54 - loss: 0.0781 - regression_loss: 0.0723 - classification_loss: 0.0058 164/500 [========>.....................] - ETA: 1:53 - loss: 0.0778 - regression_loss: 0.0720 - classification_loss: 0.0058 165/500 [========>.....................] - ETA: 1:53 - loss: 0.0775 - regression_loss: 0.0717 - classification_loss: 0.0058 166/500 [========>.....................] - ETA: 1:53 - loss: 0.0770 - regression_loss: 0.0713 - classification_loss: 0.0057 167/500 [=========>....................] - ETA: 1:52 - loss: 0.0787 - regression_loss: 0.0729 - classification_loss: 0.0058 168/500 [=========>....................] - ETA: 1:52 - loss: 0.0789 - regression_loss: 0.0731 - classification_loss: 0.0058 169/500 [=========>....................] - ETA: 1:52 - loss: 0.0791 - regression_loss: 0.0733 - classification_loss: 0.0058 170/500 [=========>....................] - ETA: 1:51 - loss: 0.0795 - regression_loss: 0.0737 - classification_loss: 0.0058 171/500 [=========>....................] - ETA: 1:51 - loss: 0.0791 - regression_loss: 0.0733 - classification_loss: 0.0058 172/500 [=========>....................] - ETA: 1:51 - loss: 0.0788 - regression_loss: 0.0730 - classification_loss: 0.0058 173/500 [=========>....................] - ETA: 1:50 - loss: 0.0785 - regression_loss: 0.0727 - classification_loss: 0.0057 174/500 [=========>....................] 
- ETA: 1:50 - loss: 0.0789 - regression_loss: 0.0731 - classification_loss: 0.0058 175/500 [=========>....................] - ETA: 1:50 - loss: 0.0790 - regression_loss: 0.0732 - classification_loss: 0.0058 176/500 [=========>....................] - ETA: 1:49 - loss: 0.0788 - regression_loss: 0.0730 - classification_loss: 0.0057 177/500 [=========>....................] - ETA: 1:49 - loss: 0.0789 - regression_loss: 0.0732 - classification_loss: 0.0058 178/500 [=========>....................] - ETA: 1:49 - loss: 0.0798 - regression_loss: 0.0739 - classification_loss: 0.0058 179/500 [=========>....................] - ETA: 1:48 - loss: 0.0799 - regression_loss: 0.0741 - classification_loss: 0.0058 180/500 [=========>....................] - ETA: 1:48 - loss: 0.0802 - regression_loss: 0.0744 - classification_loss: 0.0058 181/500 [=========>....................] - ETA: 1:48 - loss: 0.0799 - regression_loss: 0.0741 - classification_loss: 0.0058 182/500 [=========>....................] - ETA: 1:47 - loss: 0.0797 - regression_loss: 0.0739 - classification_loss: 0.0058 183/500 [=========>....................] - ETA: 1:47 - loss: 0.0794 - regression_loss: 0.0736 - classification_loss: 0.0058 184/500 [==========>...................] - ETA: 1:47 - loss: 0.0805 - regression_loss: 0.0747 - classification_loss: 0.0058 185/500 [==========>...................] - ETA: 1:46 - loss: 0.0805 - regression_loss: 0.0747 - classification_loss: 0.0058 186/500 [==========>...................] - ETA: 1:46 - loss: 0.0806 - regression_loss: 0.0748 - classification_loss: 0.0058 187/500 [==========>...................] - ETA: 1:46 - loss: 0.0804 - regression_loss: 0.0746 - classification_loss: 0.0058 188/500 [==========>...................] - ETA: 1:45 - loss: 0.0812 - regression_loss: 0.0754 - classification_loss: 0.0058 189/500 [==========>...................] - ETA: 1:45 - loss: 0.0809 - regression_loss: 0.0751 - classification_loss: 0.0057 190/500 [==========>...................] 
- ETA: 1:45 - loss: 0.0811 - regression_loss: 0.0753 - classification_loss: 0.0058 191/500 [==========>...................] - ETA: 1:44 - loss: 0.0817 - regression_loss: 0.0758 - classification_loss: 0.0059 192/500 [==========>...................] - ETA: 1:44 - loss: 0.0814 - regression_loss: 0.0755 - classification_loss: 0.0059 193/500 [==========>...................] - ETA: 1:44 - loss: 0.0817 - regression_loss: 0.0758 - classification_loss: 0.0059 194/500 [==========>...................] - ETA: 1:43 - loss: 0.0815 - regression_loss: 0.0756 - classification_loss: 0.0059 195/500 [==========>...................] - ETA: 1:43 - loss: 0.0812 - regression_loss: 0.0754 - classification_loss: 0.0058 196/500 [==========>...................] - ETA: 1:43 - loss: 0.0809 - regression_loss: 0.0751 - classification_loss: 0.0058 197/500 [==========>...................] - ETA: 1:42 - loss: 0.0811 - regression_loss: 0.0753 - classification_loss: 0.0058 198/500 [==========>...................] - ETA: 1:42 - loss: 0.0809 - regression_loss: 0.0752 - classification_loss: 0.0058 199/500 [==========>...................] - ETA: 1:42 - loss: 0.0807 - regression_loss: 0.0749 - classification_loss: 0.0058 200/500 [===========>..................] - ETA: 1:41 - loss: 0.0807 - regression_loss: 0.0749 - classification_loss: 0.0058 201/500 [===========>..................] - ETA: 1:41 - loss: 0.0803 - regression_loss: 0.0745 - classification_loss: 0.0058 202/500 [===========>..................] - ETA: 1:41 - loss: 0.0801 - regression_loss: 0.0743 - classification_loss: 0.0058 203/500 [===========>..................] - ETA: 1:40 - loss: 0.0806 - regression_loss: 0.0748 - classification_loss: 0.0058 204/500 [===========>..................] - ETA: 1:40 - loss: 0.0805 - regression_loss: 0.0747 - classification_loss: 0.0058 205/500 [===========>..................] - ETA: 1:40 - loss: 0.0803 - regression_loss: 0.0745 - classification_loss: 0.0058 206/500 [===========>..................] 
- ETA: 1:39 - loss: 0.0804 - regression_loss: 0.0746 - classification_loss: 0.0058 207/500 [===========>..................] - ETA: 1:39 - loss: 0.0804 - regression_loss: 0.0746 - classification_loss: 0.0058 208/500 [===========>..................] - ETA: 1:39 - loss: 0.0802 - regression_loss: 0.0744 - classification_loss: 0.0058 209/500 [===========>..................] - ETA: 1:38 - loss: 0.0805 - regression_loss: 0.0747 - classification_loss: 0.0058 210/500 [===========>..................] - ETA: 1:38 - loss: 0.0805 - regression_loss: 0.0747 - classification_loss: 0.0058 211/500 [===========>..................] - ETA: 1:38 - loss: 0.0803 - regression_loss: 0.0745 - classification_loss: 0.0058 212/500 [===========>..................] - ETA: 1:37 - loss: 0.0800 - regression_loss: 0.0742 - classification_loss: 0.0057 213/500 [===========>..................] - ETA: 1:37 - loss: 0.0800 - regression_loss: 0.0743 - classification_loss: 0.0057 214/500 [===========>..................] - ETA: 1:37 - loss: 0.0804 - regression_loss: 0.0745 - classification_loss: 0.0058 215/500 [===========>..................] - ETA: 1:36 - loss: 0.0801 - regression_loss: 0.0743 - classification_loss: 0.0058 216/500 [===========>..................] - ETA: 1:36 - loss: 0.0801 - regression_loss: 0.0743 - classification_loss: 0.0058 217/500 [============>.................] - ETA: 1:36 - loss: 0.0800 - regression_loss: 0.0742 - classification_loss: 0.0058 218/500 [============>.................] - ETA: 1:35 - loss: 0.0801 - regression_loss: 0.0743 - classification_loss: 0.0058 219/500 [============>.................] - ETA: 1:35 - loss: 0.0800 - regression_loss: 0.0742 - classification_loss: 0.0058 220/500 [============>.................] - ETA: 1:35 - loss: 0.0803 - regression_loss: 0.0745 - classification_loss: 0.0058 221/500 [============>.................] - ETA: 1:34 - loss: 0.0803 - regression_loss: 0.0745 - classification_loss: 0.0058 222/500 [============>.................] 
- ETA: 1:34 - loss: 0.0805 - regression_loss: 0.0747 - classification_loss: 0.0058 223/500 [============>.................] - ETA: 1:34 - loss: 0.0803 - regression_loss: 0.0745 - classification_loss: 0.0058 224/500 [============>.................] - ETA: 1:33 - loss: 0.0802 - regression_loss: 0.0744 - classification_loss: 0.0058 225/500 [============>.................] - ETA: 1:33 - loss: 0.0801 - regression_loss: 0.0743 - classification_loss: 0.0057 226/500 [============>.................] - ETA: 1:33 - loss: 0.0799 - regression_loss: 0.0742 - classification_loss: 0.0057 227/500 [============>.................] - ETA: 1:32 - loss: 0.0798 - regression_loss: 0.0741 - classification_loss: 0.0057 228/500 [============>.................] - ETA: 1:32 - loss: 0.0795 - regression_loss: 0.0738 - classification_loss: 0.0057 229/500 [============>.................] - ETA: 1:32 - loss: 0.0801 - regression_loss: 0.0743 - classification_loss: 0.0058 230/500 [============>.................] - ETA: 1:31 - loss: 0.0799 - regression_loss: 0.0742 - classification_loss: 0.0057 231/500 [============>.................] - ETA: 1:31 - loss: 0.0798 - regression_loss: 0.0741 - classification_loss: 0.0057 232/500 [============>.................] - ETA: 1:31 - loss: 0.0802 - regression_loss: 0.0745 - classification_loss: 0.0057 233/500 [============>.................] - ETA: 1:30 - loss: 0.0800 - regression_loss: 0.0743 - classification_loss: 0.0057 234/500 [=============>................] - ETA: 1:30 - loss: 0.0799 - regression_loss: 0.0741 - classification_loss: 0.0057 235/500 [=============>................] - ETA: 1:30 - loss: 0.0796 - regression_loss: 0.0739 - classification_loss: 0.0057 236/500 [=============>................] - ETA: 1:29 - loss: 0.0796 - regression_loss: 0.0740 - classification_loss: 0.0057 237/500 [=============>................] - ETA: 1:29 - loss: 0.0798 - regression_loss: 0.0741 - classification_loss: 0.0057 238/500 [=============>................] 
- ETA: 1:29 - loss: 0.0796 - regression_loss: 0.0740 - classification_loss: 0.0057 239/500 [=============>................] - ETA: 1:28 - loss: 0.0795 - regression_loss: 0.0738 - classification_loss: 0.0056 240/500 [=============>................] - ETA: 1:28 - loss: 0.0795 - regression_loss: 0.0739 - classification_loss: 0.0057 241/500 [=============>................] - ETA: 1:28 - loss: 0.0793 - regression_loss: 0.0737 - classification_loss: 0.0056 242/500 [=============>................] - ETA: 1:27 - loss: 0.0793 - regression_loss: 0.0736 - classification_loss: 0.0056 243/500 [=============>................] - ETA: 1:27 - loss: 0.0793 - regression_loss: 0.0737 - classification_loss: 0.0056 244/500 [=============>................] - ETA: 1:27 - loss: 0.0791 - regression_loss: 0.0735 - classification_loss: 0.0056 245/500 [=============>................] - ETA: 1:26 - loss: 0.0789 - regression_loss: 0.0733 - classification_loss: 0.0056 246/500 [=============>................] - ETA: 1:26 - loss: 0.0800 - regression_loss: 0.0744 - classification_loss: 0.0056 247/500 [=============>................] - ETA: 1:26 - loss: 0.0806 - regression_loss: 0.0750 - classification_loss: 0.0056 248/500 [=============>................] - ETA: 1:25 - loss: 0.0808 - regression_loss: 0.0752 - classification_loss: 0.0056 249/500 [=============>................] - ETA: 1:25 - loss: 0.0805 - regression_loss: 0.0749 - classification_loss: 0.0056 250/500 [==============>...............] - ETA: 1:25 - loss: 0.0808 - regression_loss: 0.0751 - classification_loss: 0.0056 251/500 [==============>...............] - ETA: 1:24 - loss: 0.0805 - regression_loss: 0.0749 - classification_loss: 0.0056 252/500 [==============>...............] - ETA: 1:24 - loss: 0.0805 - regression_loss: 0.0749 - classification_loss: 0.0056 253/500 [==============>...............] - ETA: 1:24 - loss: 0.0804 - regression_loss: 0.0748 - classification_loss: 0.0056 254/500 [==============>...............] 
- ETA: 1:23 - loss: 0.0804 - regression_loss: 0.0748 - classification_loss: 0.0056 255/500 [==============>...............] - ETA: 1:23 - loss: 0.0803 - regression_loss: 0.0747 - classification_loss: 0.0056 256/500 [==============>...............] - ETA: 1:23 - loss: 0.0802 - regression_loss: 0.0746 - classification_loss: 0.0056 257/500 [==============>...............] - ETA: 1:22 - loss: 0.0800 - regression_loss: 0.0744 - classification_loss: 0.0056 258/500 [==============>...............] - ETA: 1:22 - loss: 0.0798 - regression_loss: 0.0742 - classification_loss: 0.0056 259/500 [==============>...............] - ETA: 1:22 - loss: 0.0800 - regression_loss: 0.0744 - classification_loss: 0.0056 260/500 [==============>...............] - ETA: 1:21 - loss: 0.0806 - regression_loss: 0.0750 - classification_loss: 0.0056 261/500 [==============>...............] - ETA: 1:21 - loss: 0.0805 - regression_loss: 0.0749 - classification_loss: 0.0056 262/500 [==============>...............] - ETA: 1:21 - loss: 0.0807 - regression_loss: 0.0751 - classification_loss: 0.0056 263/500 [==============>...............] - ETA: 1:20 - loss: 0.0807 - regression_loss: 0.0751 - classification_loss: 0.0056 264/500 [==============>...............] - ETA: 1:20 - loss: 0.0811 - regression_loss: 0.0754 - classification_loss: 0.0057 265/500 [==============>...............] - ETA: 1:20 - loss: 0.0810 - regression_loss: 0.0753 - classification_loss: 0.0057 266/500 [==============>...............] - ETA: 1:19 - loss: 0.0809 - regression_loss: 0.0752 - classification_loss: 0.0057 267/500 [===============>..............] - ETA: 1:19 - loss: 0.0808 - regression_loss: 0.0751 - classification_loss: 0.0057 268/500 [===============>..............] - ETA: 1:19 - loss: 0.0806 - regression_loss: 0.0749 - classification_loss: 0.0057 269/500 [===============>..............] - ETA: 1:18 - loss: 0.0804 - regression_loss: 0.0747 - classification_loss: 0.0057 270/500 [===============>..............] 
- ETA: 1:18 - loss: 0.0803 - regression_loss: 0.0746 - classification_loss: 0.0057 271/500 [===============>..............] - ETA: 1:18 - loss: 0.0806 - regression_loss: 0.0749 - classification_loss: 0.0057 272/500 [===============>..............] - ETA: 1:17 - loss: 0.0807 - regression_loss: 0.0750 - classification_loss: 0.0057 273/500 [===============>..............] - ETA: 1:17 - loss: 0.0805 - regression_loss: 0.0748 - classification_loss: 0.0057 274/500 [===============>..............] - ETA: 1:17 - loss: 0.0808 - regression_loss: 0.0751 - classification_loss: 0.0057 275/500 [===============>..............] - ETA: 1:16 - loss: 0.0808 - regression_loss: 0.0751 - classification_loss: 0.0057 276/500 [===============>..............] - ETA: 1:16 - loss: 0.0806 - regression_loss: 0.0749 - classification_loss: 0.0057 277/500 [===============>..............] - ETA: 1:16 - loss: 0.0805 - regression_loss: 0.0748 - classification_loss: 0.0057 278/500 [===============>..............] - ETA: 1:15 - loss: 0.0806 - regression_loss: 0.0749 - classification_loss: 0.0057 279/500 [===============>..............] - ETA: 1:15 - loss: 0.0804 - regression_loss: 0.0747 - classification_loss: 0.0057 280/500 [===============>..............] - ETA: 1:15 - loss: 0.0802 - regression_loss: 0.0746 - classification_loss: 0.0057 281/500 [===============>..............] - ETA: 1:14 - loss: 0.0800 - regression_loss: 0.0744 - classification_loss: 0.0056 282/500 [===============>..............] - ETA: 1:14 - loss: 0.0799 - regression_loss: 0.0743 - classification_loss: 0.0057 283/500 [===============>..............] - ETA: 1:13 - loss: 0.0799 - regression_loss: 0.0742 - classification_loss: 0.0056 284/500 [================>.............] - ETA: 1:13 - loss: 0.0798 - regression_loss: 0.0742 - classification_loss: 0.0056 285/500 [================>.............] - ETA: 1:13 - loss: 0.0796 - regression_loss: 0.0740 - classification_loss: 0.0056 286/500 [================>.............] 
- ETA: 1:12 - loss: 0.0797 - regression_loss: 0.0740 - classification_loss: 0.0056 287/500 [================>.............] - ETA: 1:12 - loss: 0.0797 - regression_loss: 0.0740 - classification_loss: 0.0057 288/500 [================>.............] - ETA: 1:12 - loss: 0.0794 - regression_loss: 0.0738 - classification_loss: 0.0056 289/500 [================>.............] - ETA: 1:11 - loss: 0.0792 - regression_loss: 0.0736 - classification_loss: 0.0056 290/500 [================>.............] - ETA: 1:11 - loss: 0.0793 - regression_loss: 0.0736 - classification_loss: 0.0056 291/500 [================>.............] - ETA: 1:11 - loss: 0.0791 - regression_loss: 0.0735 - classification_loss: 0.0056 292/500 [================>.............] - ETA: 1:10 - loss: 0.0790 - regression_loss: 0.0734 - classification_loss: 0.0056 293/500 [================>.............] - ETA: 1:10 - loss: 0.0788 - regression_loss: 0.0732 - classification_loss: 0.0056 294/500 [================>.............] - ETA: 1:10 - loss: 0.0787 - regression_loss: 0.0731 - classification_loss: 0.0056 295/500 [================>.............] - ETA: 1:09 - loss: 0.0790 - regression_loss: 0.0735 - classification_loss: 0.0056 296/500 [================>.............] - ETA: 1:09 - loss: 0.0797 - regression_loss: 0.0739 - classification_loss: 0.0058 297/500 [================>.............] - ETA: 1:09 - loss: 0.0797 - regression_loss: 0.0739 - classification_loss: 0.0058 298/500 [================>.............] - ETA: 1:08 - loss: 0.0795 - regression_loss: 0.0737 - classification_loss: 0.0058 299/500 [================>.............] - ETA: 1:08 - loss: 0.0794 - regression_loss: 0.0736 - classification_loss: 0.0058 300/500 [=================>............] - ETA: 1:08 - loss: 0.0793 - regression_loss: 0.0735 - classification_loss: 0.0058 301/500 [=================>............] - ETA: 1:07 - loss: 0.0800 - regression_loss: 0.0742 - classification_loss: 0.0058 302/500 [=================>............] 
- ETA: 1:07 - loss: 0.0803 - regression_loss: 0.0745 - classification_loss: 0.0058 303/500 [=================>............] - ETA: 1:07 - loss: 0.0804 - regression_loss: 0.0746 - classification_loss: 0.0058 304/500 [=================>............] - ETA: 1:06 - loss: 0.0802 - regression_loss: 0.0744 - classification_loss: 0.0058 305/500 [=================>............] - ETA: 1:06 - loss: 0.0801 - regression_loss: 0.0743 - classification_loss: 0.0058 306/500 [=================>............] - ETA: 1:06 - loss: 0.0804 - regression_loss: 0.0746 - classification_loss: 0.0058 307/500 [=================>............] - ETA: 1:05 - loss: 0.0803 - regression_loss: 0.0745 - classification_loss: 0.0058 308/500 [=================>............] - ETA: 1:05 - loss: 0.0802 - regression_loss: 0.0744 - classification_loss: 0.0058 309/500 [=================>............] - ETA: 1:05 - loss: 0.0801 - regression_loss: 0.0743 - classification_loss: 0.0058 310/500 [=================>............] - ETA: 1:04 - loss: 0.0801 - regression_loss: 0.0743 - classification_loss: 0.0058 311/500 [=================>............] - ETA: 1:04 - loss: 0.0801 - regression_loss: 0.0743 - classification_loss: 0.0058 312/500 [=================>............] - ETA: 1:04 - loss: 0.0799 - regression_loss: 0.0741 - classification_loss: 0.0058 313/500 [=================>............] - ETA: 1:03 - loss: 0.0798 - regression_loss: 0.0740 - classification_loss: 0.0058 314/500 [=================>............] - ETA: 1:03 - loss: 0.0797 - regression_loss: 0.0739 - classification_loss: 0.0058 315/500 [=================>............] - ETA: 1:03 - loss: 0.0796 - regression_loss: 0.0739 - classification_loss: 0.0058 316/500 [=================>............] - ETA: 1:02 - loss: 0.0795 - regression_loss: 0.0737 - classification_loss: 0.0058 317/500 [==================>...........] - ETA: 1:02 - loss: 0.0793 - regression_loss: 0.0736 - classification_loss: 0.0057 318/500 [==================>...........] 
- ETA: 1:02 - loss: 0.0792 - regression_loss: 0.0734 - classification_loss: 0.0057 319/500 [==================>...........] - ETA: 1:01 - loss: 0.0792 - regression_loss: 0.0735 - classification_loss: 0.0057 320/500 [==================>...........] - ETA: 1:01 - loss: 0.0792 - regression_loss: 0.0735 - classification_loss: 0.0057 321/500 [==================>...........] - ETA: 1:01 - loss: 0.0792 - regression_loss: 0.0734 - classification_loss: 0.0057 322/500 [==================>...........] - ETA: 1:00 - loss: 0.0791 - regression_loss: 0.0733 - classification_loss: 0.0057 323/500 [==================>...........] - ETA: 1:00 - loss: 0.0795 - regression_loss: 0.0737 - classification_loss: 0.0057 324/500 [==================>...........] - ETA: 59s - loss: 0.0794 - regression_loss: 0.0737 - classification_loss: 0.0057  325/500 [==================>...........] - ETA: 59s - loss: 0.0797 - regression_loss: 0.0739 - classification_loss: 0.0058 326/500 [==================>...........] - ETA: 59s - loss: 0.0795 - regression_loss: 0.0738 - classification_loss: 0.0058 327/500 [==================>...........] - ETA: 58s - loss: 0.0793 - regression_loss: 0.0736 - classification_loss: 0.0058 328/500 [==================>...........] - ETA: 58s - loss: 0.0794 - regression_loss: 0.0737 - classification_loss: 0.0058 329/500 [==================>...........] - ETA: 58s - loss: 0.0793 - regression_loss: 0.0736 - classification_loss: 0.0058 330/500 [==================>...........] - ETA: 57s - loss: 0.0797 - regression_loss: 0.0740 - classification_loss: 0.0057 331/500 [==================>...........] - ETA: 57s - loss: 0.0797 - regression_loss: 0.0739 - classification_loss: 0.0057 332/500 [==================>...........] - ETA: 57s - loss: 0.0795 - regression_loss: 0.0738 - classification_loss: 0.0057 333/500 [==================>...........] - ETA: 56s - loss: 0.0796 - regression_loss: 0.0738 - classification_loss: 0.0057 334/500 [===================>..........] 
[per-batch progress output for steps 335–499 of epoch 55 omitted; loss hovered around 0.0790–0.0805, regression_loss ~0.074, classification_loss ~0.0056]
500/500 [==============================] - 169s 339ms/step - loss: 0.0790 - regression_loss: 0.0734 - classification_loss: 0.0055
1172 instances of class plum with average precision: 0.7584
mAP: 0.7584
Epoch 00055: saving model to ./training/snapshots/resnet101_pascal_55.h5
Epoch 56/150
[per-batch progress output for steps 1–8 of epoch 56 omitted; loss settled around 0.06–0.08 after the first few batches]
[per-batch progress output for steps 9–169 of epoch 56 omitted; loss fluctuated around 0.067–0.083, regression_loss ~0.075, classification_loss ~0.006; log truncated mid-epoch]
- ETA: 1:56 - loss: 0.0820 - regression_loss: 0.0762 - classification_loss: 0.0058 154/500 [========>.....................] - ETA: 1:56 - loss: 0.0817 - regression_loss: 0.0759 - classification_loss: 0.0058 155/500 [========>.....................] - ETA: 1:55 - loss: 0.0816 - regression_loss: 0.0758 - classification_loss: 0.0058 156/500 [========>.....................] - ETA: 1:55 - loss: 0.0827 - regression_loss: 0.0766 - classification_loss: 0.0062 157/500 [========>.....................] - ETA: 1:55 - loss: 0.0825 - regression_loss: 0.0763 - classification_loss: 0.0062 158/500 [========>.....................] - ETA: 1:54 - loss: 0.0821 - regression_loss: 0.0760 - classification_loss: 0.0061 159/500 [========>.....................] - ETA: 1:54 - loss: 0.0828 - regression_loss: 0.0766 - classification_loss: 0.0061 160/500 [========>.....................] - ETA: 1:54 - loss: 0.0824 - regression_loss: 0.0763 - classification_loss: 0.0061 161/500 [========>.....................] - ETA: 1:53 - loss: 0.0823 - regression_loss: 0.0762 - classification_loss: 0.0061 162/500 [========>.....................] - ETA: 1:53 - loss: 0.0821 - regression_loss: 0.0760 - classification_loss: 0.0061 163/500 [========>.....................] - ETA: 1:53 - loss: 0.0822 - regression_loss: 0.0761 - classification_loss: 0.0061 164/500 [========>.....................] - ETA: 1:52 - loss: 0.0821 - regression_loss: 0.0759 - classification_loss: 0.0062 165/500 [========>.....................] - ETA: 1:52 - loss: 0.0817 - regression_loss: 0.0756 - classification_loss: 0.0061 166/500 [========>.....................] - ETA: 1:52 - loss: 0.0814 - regression_loss: 0.0753 - classification_loss: 0.0061 167/500 [=========>....................] - ETA: 1:51 - loss: 0.0817 - regression_loss: 0.0755 - classification_loss: 0.0061 168/500 [=========>....................] - ETA: 1:51 - loss: 0.0814 - regression_loss: 0.0753 - classification_loss: 0.0061 169/500 [=========>....................] 
- ETA: 1:51 - loss: 0.0818 - regression_loss: 0.0756 - classification_loss: 0.0062 170/500 [=========>....................] - ETA: 1:50 - loss: 0.0814 - regression_loss: 0.0752 - classification_loss: 0.0062 171/500 [=========>....................] - ETA: 1:50 - loss: 0.0814 - regression_loss: 0.0752 - classification_loss: 0.0062 172/500 [=========>....................] - ETA: 1:50 - loss: 0.0811 - regression_loss: 0.0749 - classification_loss: 0.0061 173/500 [=========>....................] - ETA: 1:49 - loss: 0.0807 - regression_loss: 0.0746 - classification_loss: 0.0061 174/500 [=========>....................] - ETA: 1:49 - loss: 0.0805 - regression_loss: 0.0744 - classification_loss: 0.0061 175/500 [=========>....................] - ETA: 1:49 - loss: 0.0805 - regression_loss: 0.0744 - classification_loss: 0.0061 176/500 [=========>....................] - ETA: 1:48 - loss: 0.0807 - regression_loss: 0.0746 - classification_loss: 0.0061 177/500 [=========>....................] - ETA: 1:48 - loss: 0.0809 - regression_loss: 0.0748 - classification_loss: 0.0061 178/500 [=========>....................] - ETA: 1:48 - loss: 0.0809 - regression_loss: 0.0748 - classification_loss: 0.0061 179/500 [=========>....................] - ETA: 1:47 - loss: 0.0806 - regression_loss: 0.0745 - classification_loss: 0.0061 180/500 [=========>....................] - ETA: 1:47 - loss: 0.0808 - regression_loss: 0.0747 - classification_loss: 0.0061 181/500 [=========>....................] - ETA: 1:47 - loss: 0.0806 - regression_loss: 0.0745 - classification_loss: 0.0060 182/500 [=========>....................] - ETA: 1:46 - loss: 0.0809 - regression_loss: 0.0748 - classification_loss: 0.0061 183/500 [=========>....................] - ETA: 1:46 - loss: 0.0813 - regression_loss: 0.0752 - classification_loss: 0.0061 184/500 [==========>...................] - ETA: 1:46 - loss: 0.0812 - regression_loss: 0.0751 - classification_loss: 0.0061 185/500 [==========>...................] 
- ETA: 1:45 - loss: 0.0810 - regression_loss: 0.0749 - classification_loss: 0.0060 186/500 [==========>...................] - ETA: 1:45 - loss: 0.0806 - regression_loss: 0.0746 - classification_loss: 0.0060 187/500 [==========>...................] - ETA: 1:45 - loss: 0.0802 - regression_loss: 0.0743 - classification_loss: 0.0060 188/500 [==========>...................] - ETA: 1:44 - loss: 0.0803 - regression_loss: 0.0743 - classification_loss: 0.0060 189/500 [==========>...................] - ETA: 1:44 - loss: 0.0805 - regression_loss: 0.0746 - classification_loss: 0.0060 190/500 [==========>...................] - ETA: 1:44 - loss: 0.0808 - regression_loss: 0.0749 - classification_loss: 0.0060 191/500 [==========>...................] - ETA: 1:43 - loss: 0.0805 - regression_loss: 0.0746 - classification_loss: 0.0059 192/500 [==========>...................] - ETA: 1:43 - loss: 0.0805 - regression_loss: 0.0745 - classification_loss: 0.0059 193/500 [==========>...................] - ETA: 1:42 - loss: 0.0805 - regression_loss: 0.0745 - classification_loss: 0.0059 194/500 [==========>...................] - ETA: 1:42 - loss: 0.0804 - regression_loss: 0.0744 - classification_loss: 0.0059 195/500 [==========>...................] - ETA: 1:42 - loss: 0.0811 - regression_loss: 0.0751 - classification_loss: 0.0060 196/500 [==========>...................] - ETA: 1:41 - loss: 0.0810 - regression_loss: 0.0750 - classification_loss: 0.0059 197/500 [==========>...................] - ETA: 1:41 - loss: 0.0808 - regression_loss: 0.0749 - classification_loss: 0.0059 198/500 [==========>...................] - ETA: 1:41 - loss: 0.0807 - regression_loss: 0.0748 - classification_loss: 0.0059 199/500 [==========>...................] - ETA: 1:41 - loss: 0.0812 - regression_loss: 0.0753 - classification_loss: 0.0059 200/500 [===========>..................] - ETA: 1:40 - loss: 0.0810 - regression_loss: 0.0750 - classification_loss: 0.0059 201/500 [===========>..................] 
- ETA: 1:40 - loss: 0.0807 - regression_loss: 0.0748 - classification_loss: 0.0059 202/500 [===========>..................] - ETA: 1:39 - loss: 0.0819 - regression_loss: 0.0761 - classification_loss: 0.0059 203/500 [===========>..................] - ETA: 1:39 - loss: 0.0817 - regression_loss: 0.0758 - classification_loss: 0.0059 204/500 [===========>..................] - ETA: 1:39 - loss: 0.0816 - regression_loss: 0.0757 - classification_loss: 0.0059 205/500 [===========>..................] - ETA: 1:38 - loss: 0.0816 - regression_loss: 0.0757 - classification_loss: 0.0059 206/500 [===========>..................] - ETA: 1:38 - loss: 0.0819 - regression_loss: 0.0760 - classification_loss: 0.0059 207/500 [===========>..................] - ETA: 1:38 - loss: 0.0817 - regression_loss: 0.0759 - classification_loss: 0.0059 208/500 [===========>..................] - ETA: 1:37 - loss: 0.0815 - regression_loss: 0.0757 - classification_loss: 0.0058 209/500 [===========>..................] - ETA: 1:37 - loss: 0.0818 - regression_loss: 0.0759 - classification_loss: 0.0058 210/500 [===========>..................] - ETA: 1:37 - loss: 0.0816 - regression_loss: 0.0758 - classification_loss: 0.0058 211/500 [===========>..................] - ETA: 1:36 - loss: 0.0817 - regression_loss: 0.0759 - classification_loss: 0.0059 212/500 [===========>..................] - ETA: 1:36 - loss: 0.0815 - regression_loss: 0.0756 - classification_loss: 0.0058 213/500 [===========>..................] - ETA: 1:36 - loss: 0.0813 - regression_loss: 0.0755 - classification_loss: 0.0058 214/500 [===========>..................] - ETA: 1:35 - loss: 0.0818 - regression_loss: 0.0759 - classification_loss: 0.0058 215/500 [===========>..................] - ETA: 1:35 - loss: 0.0815 - regression_loss: 0.0757 - classification_loss: 0.0058 216/500 [===========>..................] - ETA: 1:35 - loss: 0.0815 - regression_loss: 0.0756 - classification_loss: 0.0058 217/500 [============>.................] 
- ETA: 1:34 - loss: 0.0813 - regression_loss: 0.0755 - classification_loss: 0.0058 218/500 [============>.................] - ETA: 1:34 - loss: 0.0811 - regression_loss: 0.0753 - classification_loss: 0.0058 219/500 [============>.................] - ETA: 1:34 - loss: 0.0811 - regression_loss: 0.0753 - classification_loss: 0.0058 220/500 [============>.................] - ETA: 1:33 - loss: 0.0810 - regression_loss: 0.0752 - classification_loss: 0.0058 221/500 [============>.................] - ETA: 1:33 - loss: 0.0813 - regression_loss: 0.0754 - classification_loss: 0.0059 222/500 [============>.................] - ETA: 1:33 - loss: 0.0817 - regression_loss: 0.0758 - classification_loss: 0.0059 223/500 [============>.................] - ETA: 1:32 - loss: 0.0814 - regression_loss: 0.0755 - classification_loss: 0.0058 224/500 [============>.................] - ETA: 1:32 - loss: 0.0812 - regression_loss: 0.0753 - classification_loss: 0.0058 225/500 [============>.................] - ETA: 1:32 - loss: 0.0811 - regression_loss: 0.0753 - classification_loss: 0.0058 226/500 [============>.................] - ETA: 1:31 - loss: 0.0808 - regression_loss: 0.0750 - classification_loss: 0.0058 227/500 [============>.................] - ETA: 1:31 - loss: 0.0807 - regression_loss: 0.0749 - classification_loss: 0.0058 228/500 [============>.................] - ETA: 1:31 - loss: 0.0805 - regression_loss: 0.0747 - classification_loss: 0.0058 229/500 [============>.................] - ETA: 1:30 - loss: 0.0804 - regression_loss: 0.0746 - classification_loss: 0.0058 230/500 [============>.................] - ETA: 1:30 - loss: 0.0801 - regression_loss: 0.0743 - classification_loss: 0.0057 231/500 [============>.................] - ETA: 1:30 - loss: 0.0798 - regression_loss: 0.0741 - classification_loss: 0.0057 232/500 [============>.................] - ETA: 1:29 - loss: 0.0799 - regression_loss: 0.0742 - classification_loss: 0.0057 233/500 [============>.................] 
- ETA: 1:29 - loss: 0.0801 - regression_loss: 0.0744 - classification_loss: 0.0057 234/500 [=============>................] - ETA: 1:29 - loss: 0.0802 - regression_loss: 0.0745 - classification_loss: 0.0057 235/500 [=============>................] - ETA: 1:28 - loss: 0.0801 - regression_loss: 0.0744 - classification_loss: 0.0057 236/500 [=============>................] - ETA: 1:28 - loss: 0.0801 - regression_loss: 0.0743 - classification_loss: 0.0057 237/500 [=============>................] - ETA: 1:28 - loss: 0.0798 - regression_loss: 0.0741 - classification_loss: 0.0057 238/500 [=============>................] - ETA: 1:27 - loss: 0.0799 - regression_loss: 0.0742 - classification_loss: 0.0057 239/500 [=============>................] - ETA: 1:27 - loss: 0.0801 - regression_loss: 0.0744 - classification_loss: 0.0057 240/500 [=============>................] - ETA: 1:27 - loss: 0.0804 - regression_loss: 0.0747 - classification_loss: 0.0057 241/500 [=============>................] - ETA: 1:26 - loss: 0.0802 - regression_loss: 0.0745 - classification_loss: 0.0057 242/500 [=============>................] - ETA: 1:26 - loss: 0.0800 - regression_loss: 0.0743 - classification_loss: 0.0057 243/500 [=============>................] - ETA: 1:26 - loss: 0.0800 - regression_loss: 0.0743 - classification_loss: 0.0057 244/500 [=============>................] - ETA: 1:25 - loss: 0.0798 - regression_loss: 0.0741 - classification_loss: 0.0057 245/500 [=============>................] - ETA: 1:25 - loss: 0.0799 - regression_loss: 0.0743 - classification_loss: 0.0057 246/500 [=============>................] - ETA: 1:25 - loss: 0.0797 - regression_loss: 0.0740 - classification_loss: 0.0057 247/500 [=============>................] - ETA: 1:24 - loss: 0.0805 - regression_loss: 0.0748 - classification_loss: 0.0057 248/500 [=============>................] - ETA: 1:24 - loss: 0.0802 - regression_loss: 0.0746 - classification_loss: 0.0057 249/500 [=============>................] 
- ETA: 1:24 - loss: 0.0800 - regression_loss: 0.0743 - classification_loss: 0.0057 250/500 [==============>...............] - ETA: 1:23 - loss: 0.0798 - regression_loss: 0.0741 - classification_loss: 0.0056 251/500 [==============>...............] - ETA: 1:23 - loss: 0.0802 - regression_loss: 0.0744 - classification_loss: 0.0057 252/500 [==============>...............] - ETA: 1:23 - loss: 0.0799 - regression_loss: 0.0742 - classification_loss: 0.0057 253/500 [==============>...............] - ETA: 1:22 - loss: 0.0797 - regression_loss: 0.0741 - classification_loss: 0.0057 254/500 [==============>...............] - ETA: 1:22 - loss: 0.0800 - regression_loss: 0.0743 - classification_loss: 0.0057 255/500 [==============>...............] - ETA: 1:22 - loss: 0.0798 - regression_loss: 0.0741 - classification_loss: 0.0057 256/500 [==============>...............] - ETA: 1:21 - loss: 0.0802 - regression_loss: 0.0745 - classification_loss: 0.0057 257/500 [==============>...............] - ETA: 1:21 - loss: 0.0800 - regression_loss: 0.0743 - classification_loss: 0.0057 258/500 [==============>...............] - ETA: 1:21 - loss: 0.0798 - regression_loss: 0.0741 - classification_loss: 0.0057 259/500 [==============>...............] - ETA: 1:20 - loss: 0.0799 - regression_loss: 0.0742 - classification_loss: 0.0057 260/500 [==============>...............] - ETA: 1:20 - loss: 0.0798 - regression_loss: 0.0742 - classification_loss: 0.0057 261/500 [==============>...............] - ETA: 1:20 - loss: 0.0798 - regression_loss: 0.0741 - classification_loss: 0.0057 262/500 [==============>...............] - ETA: 1:19 - loss: 0.0796 - regression_loss: 0.0740 - classification_loss: 0.0056 263/500 [==============>...............] - ETA: 1:19 - loss: 0.0796 - regression_loss: 0.0740 - classification_loss: 0.0056 264/500 [==============>...............] - ETA: 1:19 - loss: 0.0794 - regression_loss: 0.0738 - classification_loss: 0.0056 265/500 [==============>...............] 
- ETA: 1:18 - loss: 0.0792 - regression_loss: 0.0736 - classification_loss: 0.0056 266/500 [==============>...............] - ETA: 1:18 - loss: 0.0790 - regression_loss: 0.0734 - classification_loss: 0.0056 267/500 [===============>..............] - ETA: 1:18 - loss: 0.0788 - regression_loss: 0.0732 - classification_loss: 0.0056 268/500 [===============>..............] - ETA: 1:17 - loss: 0.0787 - regression_loss: 0.0731 - classification_loss: 0.0056 269/500 [===============>..............] - ETA: 1:17 - loss: 0.0788 - regression_loss: 0.0732 - classification_loss: 0.0056 270/500 [===============>..............] - ETA: 1:17 - loss: 0.0790 - regression_loss: 0.0734 - classification_loss: 0.0056 271/500 [===============>..............] - ETA: 1:16 - loss: 0.0789 - regression_loss: 0.0733 - classification_loss: 0.0056 272/500 [===============>..............] - ETA: 1:16 - loss: 0.0790 - regression_loss: 0.0734 - classification_loss: 0.0056 273/500 [===============>..............] - ETA: 1:16 - loss: 0.0791 - regression_loss: 0.0735 - classification_loss: 0.0056 274/500 [===============>..............] - ETA: 1:15 - loss: 0.0793 - regression_loss: 0.0737 - classification_loss: 0.0056 275/500 [===============>..............] - ETA: 1:15 - loss: 0.0791 - regression_loss: 0.0736 - classification_loss: 0.0056 276/500 [===============>..............] - ETA: 1:15 - loss: 0.0790 - regression_loss: 0.0734 - classification_loss: 0.0056 277/500 [===============>..............] - ETA: 1:14 - loss: 0.0791 - regression_loss: 0.0735 - classification_loss: 0.0056 278/500 [===============>..............] - ETA: 1:14 - loss: 0.0791 - regression_loss: 0.0736 - classification_loss: 0.0056 279/500 [===============>..............] - ETA: 1:14 - loss: 0.0791 - regression_loss: 0.0735 - classification_loss: 0.0056 280/500 [===============>..............] - ETA: 1:13 - loss: 0.0788 - regression_loss: 0.0733 - classification_loss: 0.0056 281/500 [===============>..............] 
- ETA: 1:13 - loss: 0.0787 - regression_loss: 0.0732 - classification_loss: 0.0055 282/500 [===============>..............] - ETA: 1:13 - loss: 0.0787 - regression_loss: 0.0731 - classification_loss: 0.0056 283/500 [===============>..............] - ETA: 1:12 - loss: 0.0787 - regression_loss: 0.0731 - classification_loss: 0.0056 284/500 [================>.............] - ETA: 1:12 - loss: 0.0787 - regression_loss: 0.0731 - classification_loss: 0.0056 285/500 [================>.............] - ETA: 1:12 - loss: 0.0785 - regression_loss: 0.0729 - classification_loss: 0.0056 286/500 [================>.............] - ETA: 1:11 - loss: 0.0784 - regression_loss: 0.0729 - classification_loss: 0.0056 287/500 [================>.............] - ETA: 1:11 - loss: 0.0782 - regression_loss: 0.0726 - classification_loss: 0.0055 288/500 [================>.............] - ETA: 1:11 - loss: 0.0780 - regression_loss: 0.0725 - classification_loss: 0.0055 289/500 [================>.............] - ETA: 1:10 - loss: 0.0781 - regression_loss: 0.0725 - classification_loss: 0.0055 290/500 [================>.............] - ETA: 1:10 - loss: 0.0778 - regression_loss: 0.0723 - classification_loss: 0.0055 291/500 [================>.............] - ETA: 1:10 - loss: 0.0778 - regression_loss: 0.0722 - classification_loss: 0.0055 292/500 [================>.............] - ETA: 1:09 - loss: 0.0777 - regression_loss: 0.0721 - classification_loss: 0.0055 293/500 [================>.............] - ETA: 1:09 - loss: 0.0775 - regression_loss: 0.0720 - classification_loss: 0.0055 294/500 [================>.............] - ETA: 1:09 - loss: 0.0775 - regression_loss: 0.0720 - classification_loss: 0.0055 295/500 [================>.............] - ETA: 1:08 - loss: 0.0780 - regression_loss: 0.0724 - classification_loss: 0.0056 296/500 [================>.............] - ETA: 1:08 - loss: 0.0779 - regression_loss: 0.0724 - classification_loss: 0.0056 297/500 [================>.............] 
- ETA: 1:08 - loss: 0.0781 - regression_loss: 0.0725 - classification_loss: 0.0056 298/500 [================>.............] - ETA: 1:07 - loss: 0.0779 - regression_loss: 0.0723 - classification_loss: 0.0056 299/500 [================>.............] - ETA: 1:07 - loss: 0.0777 - regression_loss: 0.0721 - classification_loss: 0.0056 300/500 [=================>............] - ETA: 1:07 - loss: 0.0775 - regression_loss: 0.0720 - classification_loss: 0.0056 301/500 [=================>............] - ETA: 1:06 - loss: 0.0776 - regression_loss: 0.0721 - classification_loss: 0.0056 302/500 [=================>............] - ETA: 1:06 - loss: 0.0775 - regression_loss: 0.0719 - classification_loss: 0.0056 303/500 [=================>............] - ETA: 1:06 - loss: 0.0780 - regression_loss: 0.0724 - classification_loss: 0.0056 304/500 [=================>............] - ETA: 1:05 - loss: 0.0779 - regression_loss: 0.0723 - classification_loss: 0.0056 305/500 [=================>............] - ETA: 1:05 - loss: 0.0783 - regression_loss: 0.0727 - classification_loss: 0.0056 306/500 [=================>............] - ETA: 1:05 - loss: 0.0784 - regression_loss: 0.0728 - classification_loss: 0.0056 307/500 [=================>............] - ETA: 1:04 - loss: 0.0785 - regression_loss: 0.0729 - classification_loss: 0.0056 308/500 [=================>............] - ETA: 1:04 - loss: 0.0784 - regression_loss: 0.0728 - classification_loss: 0.0056 309/500 [=================>............] - ETA: 1:04 - loss: 0.0784 - regression_loss: 0.0728 - classification_loss: 0.0056 310/500 [=================>............] - ETA: 1:03 - loss: 0.0782 - regression_loss: 0.0726 - classification_loss: 0.0056 311/500 [=================>............] - ETA: 1:03 - loss: 0.0781 - regression_loss: 0.0725 - classification_loss: 0.0056 312/500 [=================>............] - ETA: 1:03 - loss: 0.0781 - regression_loss: 0.0726 - classification_loss: 0.0056 313/500 [=================>............] 
- ETA: 1:02 - loss: 0.0783 - regression_loss: 0.0727 - classification_loss: 0.0056 314/500 [=================>............] - ETA: 1:02 - loss: 0.0783 - regression_loss: 0.0727 - classification_loss: 0.0056 315/500 [=================>............] - ETA: 1:02 - loss: 0.0782 - regression_loss: 0.0727 - classification_loss: 0.0056 316/500 [=================>............] - ETA: 1:01 - loss: 0.0781 - regression_loss: 0.0726 - classification_loss: 0.0056 317/500 [==================>...........] - ETA: 1:01 - loss: 0.0780 - regression_loss: 0.0724 - classification_loss: 0.0056 318/500 [==================>...........] - ETA: 1:01 - loss: 0.0782 - regression_loss: 0.0726 - classification_loss: 0.0056 319/500 [==================>...........] - ETA: 1:00 - loss: 0.0783 - regression_loss: 0.0727 - classification_loss: 0.0056 320/500 [==================>...........] - ETA: 1:00 - loss: 0.0785 - regression_loss: 0.0728 - classification_loss: 0.0057 321/500 [==================>...........] - ETA: 1:00 - loss: 0.0783 - regression_loss: 0.0727 - classification_loss: 0.0057 322/500 [==================>...........] - ETA: 59s - loss: 0.0787 - regression_loss: 0.0730 - classification_loss: 0.0057  323/500 [==================>...........] - ETA: 59s - loss: 0.0786 - regression_loss: 0.0729 - classification_loss: 0.0057 324/500 [==================>...........] - ETA: 59s - loss: 0.0786 - regression_loss: 0.0729 - classification_loss: 0.0057 325/500 [==================>...........] - ETA: 58s - loss: 0.0786 - regression_loss: 0.0729 - classification_loss: 0.0057 326/500 [==================>...........] - ETA: 58s - loss: 0.0784 - regression_loss: 0.0728 - classification_loss: 0.0057 327/500 [==================>...........] - ETA: 58s - loss: 0.0784 - regression_loss: 0.0727 - classification_loss: 0.0057 328/500 [==================>...........] - ETA: 57s - loss: 0.0784 - regression_loss: 0.0727 - classification_loss: 0.0057 329/500 [==================>...........] 
- ETA: 57s - loss: 0.0785 - regression_loss: 0.0728 - classification_loss: 0.0057 330/500 [==================>...........] - ETA: 57s - loss: 0.0783 - regression_loss: 0.0726 - classification_loss: 0.0057 331/500 [==================>...........] - ETA: 56s - loss: 0.0782 - regression_loss: 0.0726 - classification_loss: 0.0057 332/500 [==================>...........] - ETA: 56s - loss: 0.0784 - regression_loss: 0.0727 - classification_loss: 0.0057 333/500 [==================>...........] - ETA: 56s - loss: 0.0783 - regression_loss: 0.0727 - classification_loss: 0.0057 334/500 [===================>..........] - ETA: 55s - loss: 0.0782 - regression_loss: 0.0726 - classification_loss: 0.0057 335/500 [===================>..........] - ETA: 55s - loss: 0.0783 - regression_loss: 0.0727 - classification_loss: 0.0057 336/500 [===================>..........] - ETA: 55s - loss: 0.0784 - regression_loss: 0.0727 - classification_loss: 0.0057 337/500 [===================>..........] - ETA: 54s - loss: 0.0782 - regression_loss: 0.0725 - classification_loss: 0.0057 338/500 [===================>..........] - ETA: 54s - loss: 0.0780 - regression_loss: 0.0724 - classification_loss: 0.0057 339/500 [===================>..........] - ETA: 54s - loss: 0.0783 - regression_loss: 0.0726 - classification_loss: 0.0057 340/500 [===================>..........] - ETA: 53s - loss: 0.0781 - regression_loss: 0.0724 - classification_loss: 0.0056 341/500 [===================>..........] - ETA: 53s - loss: 0.0783 - regression_loss: 0.0726 - classification_loss: 0.0057 342/500 [===================>..........] - ETA: 53s - loss: 0.0790 - regression_loss: 0.0733 - classification_loss: 0.0057 343/500 [===================>..........] - ETA: 52s - loss: 0.0788 - regression_loss: 0.0732 - classification_loss: 0.0056 344/500 [===================>..........] - ETA: 52s - loss: 0.0787 - regression_loss: 0.0730 - classification_loss: 0.0056 345/500 [===================>..........] 
- ETA: 52s - loss: 0.0788 - regression_loss: 0.0732 - classification_loss: 0.0057 346/500 [===================>..........] - ETA: 51s - loss: 0.0790 - regression_loss: 0.0733 - classification_loss: 0.0057 347/500 [===================>..........] - ETA: 51s - loss: 0.0789 - regression_loss: 0.0732 - classification_loss: 0.0057 348/500 [===================>..........] - ETA: 51s - loss: 0.0789 - regression_loss: 0.0732 - classification_loss: 0.0057 349/500 [===================>..........] - ETA: 50s - loss: 0.0790 - regression_loss: 0.0733 - classification_loss: 0.0057 350/500 [====================>.........] - ETA: 50s - loss: 0.0790 - regression_loss: 0.0733 - classification_loss: 0.0057 351/500 [====================>.........] - ETA: 50s - loss: 0.0789 - regression_loss: 0.0733 - classification_loss: 0.0057 352/500 [====================>.........] - ETA: 49s - loss: 0.0788 - regression_loss: 0.0732 - classification_loss: 0.0057 353/500 [====================>.........] - ETA: 49s - loss: 0.0786 - regression_loss: 0.0730 - classification_loss: 0.0057 354/500 [====================>.........] - ETA: 49s - loss: 0.0786 - regression_loss: 0.0729 - classification_loss: 0.0057 355/500 [====================>.........] - ETA: 48s - loss: 0.0785 - regression_loss: 0.0728 - classification_loss: 0.0057 356/500 [====================>.........] - ETA: 48s - loss: 0.0786 - regression_loss: 0.0730 - classification_loss: 0.0057 357/500 [====================>.........] - ETA: 48s - loss: 0.0786 - regression_loss: 0.0729 - classification_loss: 0.0056 358/500 [====================>.........] - ETA: 47s - loss: 0.0787 - regression_loss: 0.0731 - classification_loss: 0.0056 359/500 [====================>.........] - ETA: 47s - loss: 0.0787 - regression_loss: 0.0731 - classification_loss: 0.0056 360/500 [====================>.........] - ETA: 47s - loss: 0.0785 - regression_loss: 0.0729 - classification_loss: 0.0056 361/500 [====================>.........] 
- ETA: 46s - loss: 0.0783 - regression_loss: 0.0727 - classification_loss: 0.0056 362/500 [====================>.........] - ETA: 46s - loss: 0.0782 - regression_loss: 0.0726 - classification_loss: 0.0056 363/500 [====================>.........] - ETA: 46s - loss: 0.0783 - regression_loss: 0.0727 - classification_loss: 0.0056 364/500 [====================>.........] - ETA: 45s - loss: 0.0782 - regression_loss: 0.0726 - classification_loss: 0.0056 365/500 [====================>.........] - ETA: 45s - loss: 0.0780 - regression_loss: 0.0725 - classification_loss: 0.0056 366/500 [====================>.........] - ETA: 45s - loss: 0.0779 - regression_loss: 0.0723 - classification_loss: 0.0056 367/500 [=====================>........] - ETA: 44s - loss: 0.0777 - regression_loss: 0.0722 - classification_loss: 0.0056 368/500 [=====================>........] - ETA: 44s - loss: 0.0777 - regression_loss: 0.0721 - classification_loss: 0.0056 369/500 [=====================>........] - ETA: 44s - loss: 0.0777 - regression_loss: 0.0722 - classification_loss: 0.0056 370/500 [=====================>........] - ETA: 43s - loss: 0.0778 - regression_loss: 0.0722 - classification_loss: 0.0056 371/500 [=====================>........] - ETA: 43s - loss: 0.0779 - regression_loss: 0.0723 - classification_loss: 0.0056 372/500 [=====================>........] - ETA: 43s - loss: 0.0777 - regression_loss: 0.0722 - classification_loss: 0.0056 373/500 [=====================>........] - ETA: 42s - loss: 0.0777 - regression_loss: 0.0721 - classification_loss: 0.0056 374/500 [=====================>........] - ETA: 42s - loss: 0.0776 - regression_loss: 0.0720 - classification_loss: 0.0055 375/500 [=====================>........] - ETA: 42s - loss: 0.0774 - regression_loss: 0.0719 - classification_loss: 0.0055 376/500 [=====================>........] - ETA: 41s - loss: 0.0773 - regression_loss: 0.0718 - classification_loss: 0.0055 377/500 [=====================>........] 
... [per-batch progress output for steps 378-499 of epoch 56 omitted; loss held steady around 0.076 (regression_loss ~0.070, classification_loss ~0.0055)] ...
500/500 [==============================] - 168s 336ms/step - loss: 0.0760 - regression_loss: 0.0705 - classification_loss: 0.0055
1172 instances of class plum with average precision: 0.7571
mAP: 0.7571
Epoch 00056: saving model to ./training/snapshots/resnet101_pascal_56.h5
Epoch 57/150
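The epoch-end summary line above follows Keras's standard progress-bar format (`name: value` pairs separated by ` - `). As a minimal illustrative sketch (the parser itself is an assumption, not part of this training code), the loss components can be pulled out of such a line with the standard library:

```python
import re

# Matches Keras summary fields of the form "loss: 0.0760"
FIELD = re.compile(r"(\w+): (\d+\.\d+)")

def parse_summary(line):
    """Extract named float metrics from a Keras progress-bar summary line."""
    return {name: float(val) for name, val in FIELD.findall(line)}

summary = ("500/500 [==============================] - 168s 336ms/step "
           "- loss: 0.0760 - regression_loss: 0.0705 - classification_loss: 0.0055")
metrics = parse_summary(summary)
print(metrics)  # {'loss': 0.076, 'regression_loss': 0.0705, 'classification_loss': 0.0055}
```

This is handy for plotting loss curves from a saved log without rerunning training.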
... [per-batch progress output for steps up to 212 of epoch 57 omitted; loss settled to roughly 0.074-0.080 after the first few batches] ...
- ETA: 1:37 - loss: 0.0739 - regression_loss: 0.0683 - classification_loss: 0.0056 213/500 [===========>..................] - ETA: 1:36 - loss: 0.0745 - regression_loss: 0.0688 - classification_loss: 0.0056 214/500 [===========>..................] - ETA: 1:36 - loss: 0.0744 - regression_loss: 0.0688 - classification_loss: 0.0056 215/500 [===========>..................] - ETA: 1:35 - loss: 0.0745 - regression_loss: 0.0689 - classification_loss: 0.0056 216/500 [===========>..................] - ETA: 1:35 - loss: 0.0745 - regression_loss: 0.0689 - classification_loss: 0.0056 217/500 [============>.................] - ETA: 1:35 - loss: 0.0742 - regression_loss: 0.0686 - classification_loss: 0.0056 218/500 [============>.................] - ETA: 1:34 - loss: 0.0742 - regression_loss: 0.0685 - classification_loss: 0.0056 219/500 [============>.................] - ETA: 1:34 - loss: 0.0739 - regression_loss: 0.0683 - classification_loss: 0.0056 220/500 [============>.................] - ETA: 1:34 - loss: 0.0742 - regression_loss: 0.0685 - classification_loss: 0.0057 221/500 [============>.................] - ETA: 1:33 - loss: 0.0743 - regression_loss: 0.0686 - classification_loss: 0.0057 222/500 [============>.................] - ETA: 1:33 - loss: 0.0743 - regression_loss: 0.0686 - classification_loss: 0.0057 223/500 [============>.................] - ETA: 1:33 - loss: 0.0741 - regression_loss: 0.0684 - classification_loss: 0.0057 224/500 [============>.................] - ETA: 1:32 - loss: 0.0739 - regression_loss: 0.0682 - classification_loss: 0.0057 225/500 [============>.................] - ETA: 1:32 - loss: 0.0739 - regression_loss: 0.0683 - classification_loss: 0.0057 226/500 [============>.................] - ETA: 1:32 - loss: 0.0739 - regression_loss: 0.0682 - classification_loss: 0.0057 227/500 [============>.................] - ETA: 1:31 - loss: 0.0737 - regression_loss: 0.0680 - classification_loss: 0.0057 228/500 [============>.................] 
- ETA: 1:31 - loss: 0.0747 - regression_loss: 0.0691 - classification_loss: 0.0056 229/500 [============>.................] - ETA: 1:31 - loss: 0.0745 - regression_loss: 0.0689 - classification_loss: 0.0056 230/500 [============>.................] - ETA: 1:30 - loss: 0.0743 - regression_loss: 0.0687 - classification_loss: 0.0056 231/500 [============>.................] - ETA: 1:30 - loss: 0.0742 - regression_loss: 0.0686 - classification_loss: 0.0056 232/500 [============>.................] - ETA: 1:30 - loss: 0.0746 - regression_loss: 0.0689 - classification_loss: 0.0056 233/500 [============>.................] - ETA: 1:29 - loss: 0.0748 - regression_loss: 0.0691 - classification_loss: 0.0056 234/500 [=============>................] - ETA: 1:29 - loss: 0.0748 - regression_loss: 0.0691 - classification_loss: 0.0056 235/500 [=============>................] - ETA: 1:29 - loss: 0.0749 - regression_loss: 0.0692 - classification_loss: 0.0056 236/500 [=============>................] - ETA: 1:28 - loss: 0.0750 - regression_loss: 0.0693 - classification_loss: 0.0056 237/500 [=============>................] - ETA: 1:28 - loss: 0.0749 - regression_loss: 0.0692 - classification_loss: 0.0056 238/500 [=============>................] - ETA: 1:28 - loss: 0.0748 - regression_loss: 0.0692 - classification_loss: 0.0056 239/500 [=============>................] - ETA: 1:27 - loss: 0.0753 - regression_loss: 0.0696 - classification_loss: 0.0057 240/500 [=============>................] - ETA: 1:27 - loss: 0.0751 - regression_loss: 0.0694 - classification_loss: 0.0057 241/500 [=============>................] - ETA: 1:27 - loss: 0.0748 - regression_loss: 0.0692 - classification_loss: 0.0056 242/500 [=============>................] - ETA: 1:26 - loss: 0.0747 - regression_loss: 0.0691 - classification_loss: 0.0056 243/500 [=============>................] - ETA: 1:26 - loss: 0.0747 - regression_loss: 0.0691 - classification_loss: 0.0056 244/500 [=============>................] 
- ETA: 1:26 - loss: 0.0745 - regression_loss: 0.0689 - classification_loss: 0.0056 245/500 [=============>................] - ETA: 1:25 - loss: 0.0749 - regression_loss: 0.0693 - classification_loss: 0.0056 246/500 [=============>................] - ETA: 1:25 - loss: 0.0749 - regression_loss: 0.0693 - classification_loss: 0.0056 247/500 [=============>................] - ETA: 1:25 - loss: 0.0747 - regression_loss: 0.0691 - classification_loss: 0.0056 248/500 [=============>................] - ETA: 1:24 - loss: 0.0746 - regression_loss: 0.0690 - classification_loss: 0.0056 249/500 [=============>................] - ETA: 1:24 - loss: 0.0745 - regression_loss: 0.0689 - classification_loss: 0.0056 250/500 [==============>...............] - ETA: 1:24 - loss: 0.0742 - regression_loss: 0.0687 - classification_loss: 0.0056 251/500 [==============>...............] - ETA: 1:23 - loss: 0.0741 - regression_loss: 0.0686 - classification_loss: 0.0056 252/500 [==============>...............] - ETA: 1:23 - loss: 0.0742 - regression_loss: 0.0687 - classification_loss: 0.0056 253/500 [==============>...............] - ETA: 1:23 - loss: 0.0744 - regression_loss: 0.0689 - classification_loss: 0.0056 254/500 [==============>...............] - ETA: 1:22 - loss: 0.0745 - regression_loss: 0.0690 - classification_loss: 0.0056 255/500 [==============>...............] - ETA: 1:22 - loss: 0.0746 - regression_loss: 0.0690 - classification_loss: 0.0056 256/500 [==============>...............] - ETA: 1:22 - loss: 0.0743 - regression_loss: 0.0688 - classification_loss: 0.0055 257/500 [==============>...............] - ETA: 1:21 - loss: 0.0743 - regression_loss: 0.0687 - classification_loss: 0.0056 258/500 [==============>...............] - ETA: 1:21 - loss: 0.0748 - regression_loss: 0.0692 - classification_loss: 0.0056 259/500 [==============>...............] - ETA: 1:21 - loss: 0.0749 - regression_loss: 0.0693 - classification_loss: 0.0056 260/500 [==============>...............] 
- ETA: 1:20 - loss: 0.0747 - regression_loss: 0.0692 - classification_loss: 0.0056 261/500 [==============>...............] - ETA: 1:20 - loss: 0.0745 - regression_loss: 0.0690 - classification_loss: 0.0056 262/500 [==============>...............] - ETA: 1:20 - loss: 0.0744 - regression_loss: 0.0688 - classification_loss: 0.0055 263/500 [==============>...............] - ETA: 1:19 - loss: 0.0741 - regression_loss: 0.0686 - classification_loss: 0.0055 264/500 [==============>...............] - ETA: 1:19 - loss: 0.0746 - regression_loss: 0.0691 - classification_loss: 0.0055 265/500 [==============>...............] - ETA: 1:19 - loss: 0.0748 - regression_loss: 0.0693 - classification_loss: 0.0055 266/500 [==============>...............] - ETA: 1:18 - loss: 0.0755 - regression_loss: 0.0700 - classification_loss: 0.0055 267/500 [===============>..............] - ETA: 1:18 - loss: 0.0753 - regression_loss: 0.0698 - classification_loss: 0.0055 268/500 [===============>..............] - ETA: 1:18 - loss: 0.0753 - regression_loss: 0.0698 - classification_loss: 0.0055 269/500 [===============>..............] - ETA: 1:17 - loss: 0.0751 - regression_loss: 0.0696 - classification_loss: 0.0055 270/500 [===============>..............] - ETA: 1:17 - loss: 0.0750 - regression_loss: 0.0695 - classification_loss: 0.0055 271/500 [===============>..............] - ETA: 1:17 - loss: 0.0748 - regression_loss: 0.0694 - classification_loss: 0.0055 272/500 [===============>..............] - ETA: 1:16 - loss: 0.0749 - regression_loss: 0.0694 - classification_loss: 0.0055 273/500 [===============>..............] - ETA: 1:16 - loss: 0.0751 - regression_loss: 0.0697 - classification_loss: 0.0055 274/500 [===============>..............] - ETA: 1:16 - loss: 0.0750 - regression_loss: 0.0695 - classification_loss: 0.0055 275/500 [===============>..............] - ETA: 1:15 - loss: 0.0748 - regression_loss: 0.0694 - classification_loss: 0.0055 276/500 [===============>..............] 
- ETA: 1:15 - loss: 0.0749 - regression_loss: 0.0694 - classification_loss: 0.0055 277/500 [===============>..............] - ETA: 1:15 - loss: 0.0747 - regression_loss: 0.0692 - classification_loss: 0.0054 278/500 [===============>..............] - ETA: 1:14 - loss: 0.0746 - regression_loss: 0.0692 - classification_loss: 0.0054 279/500 [===============>..............] - ETA: 1:14 - loss: 0.0748 - regression_loss: 0.0693 - classification_loss: 0.0054 280/500 [===============>..............] - ETA: 1:14 - loss: 0.0748 - regression_loss: 0.0694 - classification_loss: 0.0054 281/500 [===============>..............] - ETA: 1:13 - loss: 0.0749 - regression_loss: 0.0695 - classification_loss: 0.0054 282/500 [===============>..............] - ETA: 1:13 - loss: 0.0750 - regression_loss: 0.0696 - classification_loss: 0.0054 283/500 [===============>..............] - ETA: 1:13 - loss: 0.0751 - regression_loss: 0.0696 - classification_loss: 0.0054 284/500 [================>.............] - ETA: 1:12 - loss: 0.0749 - regression_loss: 0.0695 - classification_loss: 0.0054 285/500 [================>.............] - ETA: 1:12 - loss: 0.0749 - regression_loss: 0.0694 - classification_loss: 0.0054 286/500 [================>.............] - ETA: 1:12 - loss: 0.0749 - regression_loss: 0.0694 - classification_loss: 0.0054 287/500 [================>.............] - ETA: 1:11 - loss: 0.0748 - regression_loss: 0.0694 - classification_loss: 0.0054 288/500 [================>.............] - ETA: 1:11 - loss: 0.0748 - regression_loss: 0.0694 - classification_loss: 0.0054 289/500 [================>.............] - ETA: 1:11 - loss: 0.0747 - regression_loss: 0.0693 - classification_loss: 0.0054 290/500 [================>.............] - ETA: 1:10 - loss: 0.0746 - regression_loss: 0.0692 - classification_loss: 0.0054 291/500 [================>.............] - ETA: 1:10 - loss: 0.0745 - regression_loss: 0.0691 - classification_loss: 0.0054 292/500 [================>.............] 
- ETA: 1:10 - loss: 0.0743 - regression_loss: 0.0690 - classification_loss: 0.0054 293/500 [================>.............] - ETA: 1:09 - loss: 0.0744 - regression_loss: 0.0690 - classification_loss: 0.0054 294/500 [================>.............] - ETA: 1:09 - loss: 0.0747 - regression_loss: 0.0693 - classification_loss: 0.0054 295/500 [================>.............] - ETA: 1:09 - loss: 0.0748 - regression_loss: 0.0694 - classification_loss: 0.0054 296/500 [================>.............] - ETA: 1:08 - loss: 0.0747 - regression_loss: 0.0693 - classification_loss: 0.0054 297/500 [================>.............] - ETA: 1:08 - loss: 0.0745 - regression_loss: 0.0691 - classification_loss: 0.0054 298/500 [================>.............] - ETA: 1:08 - loss: 0.0745 - regression_loss: 0.0691 - classification_loss: 0.0054 299/500 [================>.............] - ETA: 1:07 - loss: 0.0745 - regression_loss: 0.0691 - classification_loss: 0.0054 300/500 [=================>............] - ETA: 1:07 - loss: 0.0743 - regression_loss: 0.0690 - classification_loss: 0.0054 301/500 [=================>............] - ETA: 1:07 - loss: 0.0745 - regression_loss: 0.0691 - classification_loss: 0.0054 302/500 [=================>............] - ETA: 1:06 - loss: 0.0746 - regression_loss: 0.0692 - classification_loss: 0.0054 303/500 [=================>............] - ETA: 1:06 - loss: 0.0749 - regression_loss: 0.0695 - classification_loss: 0.0054 304/500 [=================>............] - ETA: 1:06 - loss: 0.0749 - regression_loss: 0.0695 - classification_loss: 0.0054 305/500 [=================>............] - ETA: 1:05 - loss: 0.0747 - regression_loss: 0.0693 - classification_loss: 0.0054 306/500 [=================>............] - ETA: 1:05 - loss: 0.0746 - regression_loss: 0.0692 - classification_loss: 0.0054 307/500 [=================>............] - ETA: 1:04 - loss: 0.0745 - regression_loss: 0.0691 - classification_loss: 0.0054 308/500 [=================>............] 
- ETA: 1:04 - loss: 0.0750 - regression_loss: 0.0696 - classification_loss: 0.0054 309/500 [=================>............] - ETA: 1:04 - loss: 0.0749 - regression_loss: 0.0695 - classification_loss: 0.0054 310/500 [=================>............] - ETA: 1:03 - loss: 0.0747 - regression_loss: 0.0693 - classification_loss: 0.0054 311/500 [=================>............] - ETA: 1:03 - loss: 0.0746 - regression_loss: 0.0692 - classification_loss: 0.0054 312/500 [=================>............] - ETA: 1:03 - loss: 0.0744 - regression_loss: 0.0691 - classification_loss: 0.0054 313/500 [=================>............] - ETA: 1:02 - loss: 0.0743 - regression_loss: 0.0689 - classification_loss: 0.0053 314/500 [=================>............] - ETA: 1:02 - loss: 0.0742 - regression_loss: 0.0689 - classification_loss: 0.0053 315/500 [=================>............] - ETA: 1:02 - loss: 0.0741 - regression_loss: 0.0688 - classification_loss: 0.0053 316/500 [=================>............] - ETA: 1:01 - loss: 0.0740 - regression_loss: 0.0687 - classification_loss: 0.0053 317/500 [==================>...........] - ETA: 1:01 - loss: 0.0742 - regression_loss: 0.0689 - classification_loss: 0.0054 318/500 [==================>...........] - ETA: 1:01 - loss: 0.0742 - regression_loss: 0.0688 - classification_loss: 0.0054 319/500 [==================>...........] - ETA: 1:00 - loss: 0.0741 - regression_loss: 0.0687 - classification_loss: 0.0054 320/500 [==================>...........] - ETA: 1:00 - loss: 0.0741 - regression_loss: 0.0688 - classification_loss: 0.0054 321/500 [==================>...........] - ETA: 1:00 - loss: 0.0740 - regression_loss: 0.0686 - classification_loss: 0.0053 322/500 [==================>...........] - ETA: 59s - loss: 0.0738 - regression_loss: 0.0685 - classification_loss: 0.0053  323/500 [==================>...........] - ETA: 59s - loss: 0.0736 - regression_loss: 0.0683 - classification_loss: 0.0053 324/500 [==================>...........] 
- ETA: 59s - loss: 0.0736 - regression_loss: 0.0683 - classification_loss: 0.0053 325/500 [==================>...........] - ETA: 58s - loss: 0.0735 - regression_loss: 0.0682 - classification_loss: 0.0053 326/500 [==================>...........] - ETA: 58s - loss: 0.0733 - regression_loss: 0.0680 - classification_loss: 0.0053 327/500 [==================>...........] - ETA: 58s - loss: 0.0731 - regression_loss: 0.0679 - classification_loss: 0.0053 328/500 [==================>...........] - ETA: 57s - loss: 0.0730 - regression_loss: 0.0678 - classification_loss: 0.0053 329/500 [==================>...........] - ETA: 57s - loss: 0.0729 - regression_loss: 0.0677 - classification_loss: 0.0053 330/500 [==================>...........] - ETA: 57s - loss: 0.0732 - regression_loss: 0.0680 - classification_loss: 0.0053 331/500 [==================>...........] - ETA: 56s - loss: 0.0733 - regression_loss: 0.0680 - classification_loss: 0.0053 332/500 [==================>...........] - ETA: 56s - loss: 0.0738 - regression_loss: 0.0684 - classification_loss: 0.0055 333/500 [==================>...........] - ETA: 56s - loss: 0.0740 - regression_loss: 0.0685 - classification_loss: 0.0055 334/500 [===================>..........] - ETA: 55s - loss: 0.0738 - regression_loss: 0.0683 - classification_loss: 0.0054 335/500 [===================>..........] - ETA: 55s - loss: 0.0737 - regression_loss: 0.0682 - classification_loss: 0.0054 336/500 [===================>..........] - ETA: 55s - loss: 0.0737 - regression_loss: 0.0682 - classification_loss: 0.0054 337/500 [===================>..........] - ETA: 54s - loss: 0.0740 - regression_loss: 0.0685 - classification_loss: 0.0054 338/500 [===================>..........] - ETA: 54s - loss: 0.0740 - regression_loss: 0.0686 - classification_loss: 0.0054 339/500 [===================>..........] - ETA: 54s - loss: 0.0743 - regression_loss: 0.0689 - classification_loss: 0.0054 340/500 [===================>..........] 
- ETA: 53s - loss: 0.0742 - regression_loss: 0.0688 - classification_loss: 0.0054 341/500 [===================>..........] - ETA: 53s - loss: 0.0743 - regression_loss: 0.0689 - classification_loss: 0.0054 342/500 [===================>..........] - ETA: 53s - loss: 0.0741 - regression_loss: 0.0687 - classification_loss: 0.0054 343/500 [===================>..........] - ETA: 52s - loss: 0.0740 - regression_loss: 0.0686 - classification_loss: 0.0054 344/500 [===================>..........] - ETA: 52s - loss: 0.0739 - regression_loss: 0.0685 - classification_loss: 0.0054 345/500 [===================>..........] - ETA: 52s - loss: 0.0737 - regression_loss: 0.0684 - classification_loss: 0.0054 346/500 [===================>..........] - ETA: 51s - loss: 0.0737 - regression_loss: 0.0684 - classification_loss: 0.0054 347/500 [===================>..........] - ETA: 51s - loss: 0.0736 - regression_loss: 0.0682 - classification_loss: 0.0054 348/500 [===================>..........] - ETA: 51s - loss: 0.0739 - regression_loss: 0.0685 - classification_loss: 0.0054 349/500 [===================>..........] - ETA: 50s - loss: 0.0739 - regression_loss: 0.0685 - classification_loss: 0.0054 350/500 [====================>.........] - ETA: 50s - loss: 0.0743 - regression_loss: 0.0688 - classification_loss: 0.0054 351/500 [====================>.........] - ETA: 50s - loss: 0.0741 - regression_loss: 0.0687 - classification_loss: 0.0054 352/500 [====================>.........] - ETA: 49s - loss: 0.0740 - regression_loss: 0.0686 - classification_loss: 0.0054 353/500 [====================>.........] - ETA: 49s - loss: 0.0739 - regression_loss: 0.0685 - classification_loss: 0.0054 354/500 [====================>.........] - ETA: 49s - loss: 0.0738 - regression_loss: 0.0684 - classification_loss: 0.0054 355/500 [====================>.........] - ETA: 48s - loss: 0.0738 - regression_loss: 0.0684 - classification_loss: 0.0054 356/500 [====================>.........] 
- ETA: 48s - loss: 0.0738 - regression_loss: 0.0684 - classification_loss: 0.0054 357/500 [====================>.........] - ETA: 48s - loss: 0.0736 - regression_loss: 0.0682 - classification_loss: 0.0054 358/500 [====================>.........] - ETA: 47s - loss: 0.0737 - regression_loss: 0.0683 - classification_loss: 0.0054 359/500 [====================>.........] - ETA: 47s - loss: 0.0738 - regression_loss: 0.0684 - classification_loss: 0.0054 360/500 [====================>.........] - ETA: 47s - loss: 0.0740 - regression_loss: 0.0686 - classification_loss: 0.0054 361/500 [====================>.........] - ETA: 46s - loss: 0.0739 - regression_loss: 0.0685 - classification_loss: 0.0054 362/500 [====================>.........] - ETA: 46s - loss: 0.0741 - regression_loss: 0.0686 - classification_loss: 0.0055 363/500 [====================>.........] - ETA: 46s - loss: 0.0744 - regression_loss: 0.0689 - classification_loss: 0.0055 364/500 [====================>.........] - ETA: 45s - loss: 0.0745 - regression_loss: 0.0690 - classification_loss: 0.0055 365/500 [====================>.........] - ETA: 45s - loss: 0.0745 - regression_loss: 0.0690 - classification_loss: 0.0055 366/500 [====================>.........] - ETA: 45s - loss: 0.0748 - regression_loss: 0.0693 - classification_loss: 0.0055 367/500 [=====================>........] - ETA: 44s - loss: 0.0747 - regression_loss: 0.0692 - classification_loss: 0.0055 368/500 [=====================>........] - ETA: 44s - loss: 0.0746 - regression_loss: 0.0691 - classification_loss: 0.0055 369/500 [=====================>........] - ETA: 44s - loss: 0.0744 - regression_loss: 0.0690 - classification_loss: 0.0054 370/500 [=====================>........] - ETA: 43s - loss: 0.0745 - regression_loss: 0.0690 - classification_loss: 0.0054 371/500 [=====================>........] - ETA: 43s - loss: 0.0746 - regression_loss: 0.0691 - classification_loss: 0.0054 372/500 [=====================>........] 
- ETA: 43s - loss: 0.0744 - regression_loss: 0.0690 - classification_loss: 0.0054 373/500 [=====================>........] - ETA: 42s - loss: 0.0744 - regression_loss: 0.0689 - classification_loss: 0.0054 374/500 [=====================>........] - ETA: 42s - loss: 0.0746 - regression_loss: 0.0691 - classification_loss: 0.0054 375/500 [=====================>........] - ETA: 42s - loss: 0.0745 - regression_loss: 0.0690 - classification_loss: 0.0054 376/500 [=====================>........] - ETA: 41s - loss: 0.0746 - regression_loss: 0.0691 - classification_loss: 0.0054 377/500 [=====================>........] - ETA: 41s - loss: 0.0745 - regression_loss: 0.0690 - classification_loss: 0.0054 378/500 [=====================>........] - ETA: 41s - loss: 0.0743 - regression_loss: 0.0689 - classification_loss: 0.0054 379/500 [=====================>........] - ETA: 40s - loss: 0.0742 - regression_loss: 0.0688 - classification_loss: 0.0054 380/500 [=====================>........] - ETA: 40s - loss: 0.0741 - regression_loss: 0.0687 - classification_loss: 0.0054 381/500 [=====================>........] - ETA: 40s - loss: 0.0740 - regression_loss: 0.0686 - classification_loss: 0.0054 382/500 [=====================>........] - ETA: 39s - loss: 0.0739 - regression_loss: 0.0685 - classification_loss: 0.0054 383/500 [=====================>........] - ETA: 39s - loss: 0.0738 - regression_loss: 0.0684 - classification_loss: 0.0054 384/500 [======================>.......] - ETA: 39s - loss: 0.0737 - regression_loss: 0.0683 - classification_loss: 0.0054 385/500 [======================>.......] - ETA: 38s - loss: 0.0739 - regression_loss: 0.0685 - classification_loss: 0.0054 386/500 [======================>.......] - ETA: 38s - loss: 0.0740 - regression_loss: 0.0686 - classification_loss: 0.0054 387/500 [======================>.......] - ETA: 38s - loss: 0.0739 - regression_loss: 0.0686 - classification_loss: 0.0054 388/500 [======================>.......] 
- ETA: 37s - loss: 0.0738 - regression_loss: 0.0684 - classification_loss: 0.0054 389/500 [======================>.......] - ETA: 37s - loss: 0.0737 - regression_loss: 0.0683 - classification_loss: 0.0054 390/500 [======================>.......] - ETA: 37s - loss: 0.0735 - regression_loss: 0.0682 - classification_loss: 0.0054 391/500 [======================>.......] - ETA: 36s - loss: 0.0735 - regression_loss: 0.0681 - classification_loss: 0.0054 392/500 [======================>.......] - ETA: 36s - loss: 0.0736 - regression_loss: 0.0682 - classification_loss: 0.0054 393/500 [======================>.......] - ETA: 36s - loss: 0.0734 - regression_loss: 0.0681 - classification_loss: 0.0053 394/500 [======================>.......] - ETA: 35s - loss: 0.0735 - regression_loss: 0.0681 - classification_loss: 0.0053 395/500 [======================>.......] - ETA: 35s - loss: 0.0733 - regression_loss: 0.0680 - classification_loss: 0.0053 396/500 [======================>.......] - ETA: 35s - loss: 0.0732 - regression_loss: 0.0679 - classification_loss: 0.0053 397/500 [======================>.......] - ETA: 34s - loss: 0.0733 - regression_loss: 0.0679 - classification_loss: 0.0053 398/500 [======================>.......] - ETA: 34s - loss: 0.0735 - regression_loss: 0.0681 - classification_loss: 0.0053 399/500 [======================>.......] - ETA: 34s - loss: 0.0741 - regression_loss: 0.0688 - classification_loss: 0.0053 400/500 [=======================>......] - ETA: 33s - loss: 0.0741 - regression_loss: 0.0688 - classification_loss: 0.0053 401/500 [=======================>......] - ETA: 33s - loss: 0.0741 - regression_loss: 0.0688 - classification_loss: 0.0053 402/500 [=======================>......] - ETA: 32s - loss: 0.0739 - regression_loss: 0.0686 - classification_loss: 0.0053 403/500 [=======================>......] - ETA: 32s - loss: 0.0738 - regression_loss: 0.0685 - classification_loss: 0.0053 404/500 [=======================>......] 
- ETA: 32s - loss: 0.0737 - regression_loss: 0.0684 - classification_loss: 0.0053 405/500 [=======================>......] - ETA: 31s - loss: 0.0735 - regression_loss: 0.0682 - classification_loss: 0.0053 406/500 [=======================>......] - ETA: 31s - loss: 0.0736 - regression_loss: 0.0683 - classification_loss: 0.0053 407/500 [=======================>......] - ETA: 31s - loss: 0.0736 - regression_loss: 0.0683 - classification_loss: 0.0053 408/500 [=======================>......] - ETA: 30s - loss: 0.0735 - regression_loss: 0.0682 - classification_loss: 0.0053 409/500 [=======================>......] - ETA: 30s - loss: 0.0734 - regression_loss: 0.0681 - classification_loss: 0.0053 410/500 [=======================>......] - ETA: 30s - loss: 0.0732 - regression_loss: 0.0680 - classification_loss: 0.0052 411/500 [=======================>......] - ETA: 29s - loss: 0.0733 - regression_loss: 0.0680 - classification_loss: 0.0052 412/500 [=======================>......] - ETA: 29s - loss: 0.0734 - regression_loss: 0.0681 - classification_loss: 0.0052 413/500 [=======================>......] - ETA: 29s - loss: 0.0733 - regression_loss: 0.0681 - classification_loss: 0.0053 414/500 [=======================>......] - ETA: 28s - loss: 0.0736 - regression_loss: 0.0684 - classification_loss: 0.0053 415/500 [=======================>......] - ETA: 28s - loss: 0.0739 - regression_loss: 0.0686 - classification_loss: 0.0053 416/500 [=======================>......] - ETA: 28s - loss: 0.0739 - regression_loss: 0.0686 - classification_loss: 0.0053 417/500 [========================>.....] - ETA: 27s - loss: 0.0740 - regression_loss: 0.0688 - classification_loss: 0.0053 418/500 [========================>.....] - ETA: 27s - loss: 0.0739 - regression_loss: 0.0687 - classification_loss: 0.0053 419/500 [========================>.....] - ETA: 27s - loss: 0.0738 - regression_loss: 0.0686 - classification_loss: 0.0052 420/500 [========================>.....] 
- ETA: 26s - loss: 0.0740 - regression_loss: 0.0688 - classification_loss: 0.0052
[per-batch progress lines for epoch 57, batches 421–499, omitted; loss held steady near 0.073–0.075]
500/500 [==============================] - 168s 337ms/step - loss: 0.0732 - regression_loss: 0.0679 - classification_loss: 0.0053
1172 instances of class plum with average precision: 0.7571
mAP: 0.7571
Epoch 00057: saving model to ./training/snapshots/resnet101_pascal_57.h5
Epoch 58/150
[per-batch progress lines for epoch 58, batches 1–252, omitted; loss settled near 0.068 after early fluctuation]
253/500 [==============>...............]
- ETA: 1:23 - loss: 0.0688 - regression_loss: 0.0636 - classification_loss: 0.0052 254/500 [==============>...............] - ETA: 1:22 - loss: 0.0689 - regression_loss: 0.0637 - classification_loss: 0.0052 255/500 [==============>...............] - ETA: 1:22 - loss: 0.0689 - regression_loss: 0.0637 - classification_loss: 0.0052 256/500 [==============>...............] - ETA: 1:22 - loss: 0.0687 - regression_loss: 0.0635 - classification_loss: 0.0051 257/500 [==============>...............] - ETA: 1:21 - loss: 0.0686 - regression_loss: 0.0634 - classification_loss: 0.0051 258/500 [==============>...............] - ETA: 1:21 - loss: 0.0686 - regression_loss: 0.0635 - classification_loss: 0.0051 259/500 [==============>...............] - ETA: 1:21 - loss: 0.0689 - regression_loss: 0.0637 - classification_loss: 0.0052 260/500 [==============>...............] - ETA: 1:20 - loss: 0.0687 - regression_loss: 0.0635 - classification_loss: 0.0052 261/500 [==============>...............] - ETA: 1:20 - loss: 0.0687 - regression_loss: 0.0635 - classification_loss: 0.0052 262/500 [==============>...............] - ETA: 1:20 - loss: 0.0685 - regression_loss: 0.0633 - classification_loss: 0.0052 263/500 [==============>...............] - ETA: 1:19 - loss: 0.0689 - regression_loss: 0.0637 - classification_loss: 0.0052 264/500 [==============>...............] - ETA: 1:19 - loss: 0.0687 - regression_loss: 0.0636 - classification_loss: 0.0052 265/500 [==============>...............] - ETA: 1:19 - loss: 0.0690 - regression_loss: 0.0637 - classification_loss: 0.0052 266/500 [==============>...............] - ETA: 1:18 - loss: 0.0688 - regression_loss: 0.0636 - classification_loss: 0.0052 267/500 [===============>..............] - ETA: 1:18 - loss: 0.0687 - regression_loss: 0.0635 - classification_loss: 0.0052 268/500 [===============>..............] - ETA: 1:18 - loss: 0.0689 - regression_loss: 0.0636 - classification_loss: 0.0052 269/500 [===============>..............] 
- ETA: 1:17 - loss: 0.0696 - regression_loss: 0.0641 - classification_loss: 0.0055 270/500 [===============>..............] - ETA: 1:17 - loss: 0.0700 - regression_loss: 0.0645 - classification_loss: 0.0055 271/500 [===============>..............] - ETA: 1:17 - loss: 0.0700 - regression_loss: 0.0645 - classification_loss: 0.0055 272/500 [===============>..............] - ETA: 1:16 - loss: 0.0699 - regression_loss: 0.0644 - classification_loss: 0.0055 273/500 [===============>..............] - ETA: 1:16 - loss: 0.0703 - regression_loss: 0.0647 - classification_loss: 0.0055 274/500 [===============>..............] - ETA: 1:16 - loss: 0.0700 - regression_loss: 0.0645 - classification_loss: 0.0055 275/500 [===============>..............] - ETA: 1:15 - loss: 0.0699 - regression_loss: 0.0644 - classification_loss: 0.0055 276/500 [===============>..............] - ETA: 1:15 - loss: 0.0704 - regression_loss: 0.0648 - classification_loss: 0.0055 277/500 [===============>..............] - ETA: 1:15 - loss: 0.0702 - regression_loss: 0.0647 - classification_loss: 0.0055 278/500 [===============>..............] - ETA: 1:14 - loss: 0.0701 - regression_loss: 0.0646 - classification_loss: 0.0055 279/500 [===============>..............] - ETA: 1:14 - loss: 0.0704 - regression_loss: 0.0649 - classification_loss: 0.0055 280/500 [===============>..............] - ETA: 1:14 - loss: 0.0703 - regression_loss: 0.0648 - classification_loss: 0.0055 281/500 [===============>..............] - ETA: 1:13 - loss: 0.0702 - regression_loss: 0.0647 - classification_loss: 0.0055 282/500 [===============>..............] - ETA: 1:13 - loss: 0.0703 - regression_loss: 0.0647 - classification_loss: 0.0055 283/500 [===============>..............] - ETA: 1:13 - loss: 0.0701 - regression_loss: 0.0645 - classification_loss: 0.0055 284/500 [================>.............] - ETA: 1:12 - loss: 0.0702 - regression_loss: 0.0647 - classification_loss: 0.0055 285/500 [================>.............] 
- ETA: 1:12 - loss: 0.0701 - regression_loss: 0.0646 - classification_loss: 0.0055 286/500 [================>.............] - ETA: 1:12 - loss: 0.0699 - regression_loss: 0.0644 - classification_loss: 0.0055 287/500 [================>.............] - ETA: 1:11 - loss: 0.0698 - regression_loss: 0.0644 - classification_loss: 0.0055 288/500 [================>.............] - ETA: 1:11 - loss: 0.0698 - regression_loss: 0.0643 - classification_loss: 0.0055 289/500 [================>.............] - ETA: 1:11 - loss: 0.0701 - regression_loss: 0.0646 - classification_loss: 0.0055 290/500 [================>.............] - ETA: 1:10 - loss: 0.0703 - regression_loss: 0.0648 - classification_loss: 0.0055 291/500 [================>.............] - ETA: 1:10 - loss: 0.0704 - regression_loss: 0.0649 - classification_loss: 0.0055 292/500 [================>.............] - ETA: 1:09 - loss: 0.0703 - regression_loss: 0.0647 - classification_loss: 0.0055 293/500 [================>.............] - ETA: 1:09 - loss: 0.0707 - regression_loss: 0.0652 - classification_loss: 0.0055 294/500 [================>.............] - ETA: 1:09 - loss: 0.0711 - regression_loss: 0.0654 - classification_loss: 0.0056 295/500 [================>.............] - ETA: 1:08 - loss: 0.0712 - regression_loss: 0.0656 - classification_loss: 0.0056 296/500 [================>.............] - ETA: 1:08 - loss: 0.0710 - regression_loss: 0.0654 - classification_loss: 0.0056 297/500 [================>.............] - ETA: 1:08 - loss: 0.0709 - regression_loss: 0.0653 - classification_loss: 0.0056 298/500 [================>.............] - ETA: 1:07 - loss: 0.0707 - regression_loss: 0.0652 - classification_loss: 0.0056 299/500 [================>.............] - ETA: 1:07 - loss: 0.0706 - regression_loss: 0.0650 - classification_loss: 0.0056 300/500 [=================>............] - ETA: 1:07 - loss: 0.0709 - regression_loss: 0.0653 - classification_loss: 0.0056 301/500 [=================>............] 
- ETA: 1:06 - loss: 0.0708 - regression_loss: 0.0652 - classification_loss: 0.0055 302/500 [=================>............] - ETA: 1:06 - loss: 0.0707 - regression_loss: 0.0652 - classification_loss: 0.0055 303/500 [=================>............] - ETA: 1:06 - loss: 0.0706 - regression_loss: 0.0651 - classification_loss: 0.0055 304/500 [=================>............] - ETA: 1:05 - loss: 0.0705 - regression_loss: 0.0650 - classification_loss: 0.0055 305/500 [=================>............] - ETA: 1:05 - loss: 0.0704 - regression_loss: 0.0649 - classification_loss: 0.0055 306/500 [=================>............] - ETA: 1:05 - loss: 0.0705 - regression_loss: 0.0650 - classification_loss: 0.0055 307/500 [=================>............] - ETA: 1:04 - loss: 0.0705 - regression_loss: 0.0650 - classification_loss: 0.0055 308/500 [=================>............] - ETA: 1:04 - loss: 0.0703 - regression_loss: 0.0648 - classification_loss: 0.0055 309/500 [=================>............] - ETA: 1:04 - loss: 0.0703 - regression_loss: 0.0648 - classification_loss: 0.0055 310/500 [=================>............] - ETA: 1:03 - loss: 0.0706 - regression_loss: 0.0651 - classification_loss: 0.0055 311/500 [=================>............] - ETA: 1:03 - loss: 0.0705 - regression_loss: 0.0650 - classification_loss: 0.0055 312/500 [=================>............] - ETA: 1:03 - loss: 0.0703 - regression_loss: 0.0649 - classification_loss: 0.0055 313/500 [=================>............] - ETA: 1:02 - loss: 0.0703 - regression_loss: 0.0648 - classification_loss: 0.0055 314/500 [=================>............] - ETA: 1:02 - loss: 0.0703 - regression_loss: 0.0648 - classification_loss: 0.0055 315/500 [=================>............] - ETA: 1:02 - loss: 0.0703 - regression_loss: 0.0649 - classification_loss: 0.0055 316/500 [=================>............] - ETA: 1:01 - loss: 0.0705 - regression_loss: 0.0650 - classification_loss: 0.0055 317/500 [==================>...........] 
- ETA: 1:01 - loss: 0.0706 - regression_loss: 0.0652 - classification_loss: 0.0054 318/500 [==================>...........] - ETA: 1:01 - loss: 0.0708 - regression_loss: 0.0654 - classification_loss: 0.0054 319/500 [==================>...........] - ETA: 1:00 - loss: 0.0706 - regression_loss: 0.0652 - classification_loss: 0.0054 320/500 [==================>...........] - ETA: 1:00 - loss: 0.0705 - regression_loss: 0.0651 - classification_loss: 0.0054 321/500 [==================>...........] - ETA: 1:00 - loss: 0.0706 - regression_loss: 0.0652 - classification_loss: 0.0054 322/500 [==================>...........] - ETA: 59s - loss: 0.0705 - regression_loss: 0.0651 - classification_loss: 0.0054  323/500 [==================>...........] - ETA: 59s - loss: 0.0706 - regression_loss: 0.0652 - classification_loss: 0.0054 324/500 [==================>...........] - ETA: 59s - loss: 0.0707 - regression_loss: 0.0653 - classification_loss: 0.0054 325/500 [==================>...........] - ETA: 58s - loss: 0.0707 - regression_loss: 0.0653 - classification_loss: 0.0054 326/500 [==================>...........] - ETA: 58s - loss: 0.0707 - regression_loss: 0.0653 - classification_loss: 0.0054 327/500 [==================>...........] - ETA: 58s - loss: 0.0705 - regression_loss: 0.0651 - classification_loss: 0.0054 328/500 [==================>...........] - ETA: 57s - loss: 0.0704 - regression_loss: 0.0650 - classification_loss: 0.0054 329/500 [==================>...........] - ETA: 57s - loss: 0.0702 - regression_loss: 0.0648 - classification_loss: 0.0054 330/500 [==================>...........] - ETA: 57s - loss: 0.0701 - regression_loss: 0.0647 - classification_loss: 0.0054 331/500 [==================>...........] - ETA: 56s - loss: 0.0701 - regression_loss: 0.0647 - classification_loss: 0.0054 332/500 [==================>...........] - ETA: 56s - loss: 0.0701 - regression_loss: 0.0647 - classification_loss: 0.0054 333/500 [==================>...........] 
- ETA: 56s - loss: 0.0700 - regression_loss: 0.0646 - classification_loss: 0.0054 334/500 [===================>..........] - ETA: 55s - loss: 0.0700 - regression_loss: 0.0646 - classification_loss: 0.0054 335/500 [===================>..........] - ETA: 55s - loss: 0.0698 - regression_loss: 0.0645 - classification_loss: 0.0053 336/500 [===================>..........] - ETA: 55s - loss: 0.0697 - regression_loss: 0.0644 - classification_loss: 0.0053 337/500 [===================>..........] - ETA: 54s - loss: 0.0699 - regression_loss: 0.0645 - classification_loss: 0.0053 338/500 [===================>..........] - ETA: 54s - loss: 0.0699 - regression_loss: 0.0645 - classification_loss: 0.0053 339/500 [===================>..........] - ETA: 54s - loss: 0.0699 - regression_loss: 0.0646 - classification_loss: 0.0053 340/500 [===================>..........] - ETA: 53s - loss: 0.0699 - regression_loss: 0.0646 - classification_loss: 0.0053 341/500 [===================>..........] - ETA: 53s - loss: 0.0699 - regression_loss: 0.0646 - classification_loss: 0.0053 342/500 [===================>..........] - ETA: 53s - loss: 0.0701 - regression_loss: 0.0648 - classification_loss: 0.0053 343/500 [===================>..........] - ETA: 52s - loss: 0.0702 - regression_loss: 0.0648 - classification_loss: 0.0054 344/500 [===================>..........] - ETA: 52s - loss: 0.0701 - regression_loss: 0.0647 - classification_loss: 0.0054 345/500 [===================>..........] - ETA: 52s - loss: 0.0700 - regression_loss: 0.0646 - classification_loss: 0.0054 346/500 [===================>..........] - ETA: 51s - loss: 0.0700 - regression_loss: 0.0646 - classification_loss: 0.0054 347/500 [===================>..........] - ETA: 51s - loss: 0.0699 - regression_loss: 0.0645 - classification_loss: 0.0053 348/500 [===================>..........] - ETA: 51s - loss: 0.0700 - regression_loss: 0.0646 - classification_loss: 0.0053 349/500 [===================>..........] 
- ETA: 50s - loss: 0.0699 - regression_loss: 0.0646 - classification_loss: 0.0054 350/500 [====================>.........] - ETA: 50s - loss: 0.0698 - regression_loss: 0.0645 - classification_loss: 0.0054 351/500 [====================>.........] - ETA: 50s - loss: 0.0697 - regression_loss: 0.0644 - classification_loss: 0.0054 352/500 [====================>.........] - ETA: 49s - loss: 0.0697 - regression_loss: 0.0643 - classification_loss: 0.0054 353/500 [====================>.........] - ETA: 49s - loss: 0.0695 - regression_loss: 0.0642 - classification_loss: 0.0054 354/500 [====================>.........] - ETA: 49s - loss: 0.0694 - regression_loss: 0.0640 - classification_loss: 0.0053 355/500 [====================>.........] - ETA: 48s - loss: 0.0693 - regression_loss: 0.0640 - classification_loss: 0.0053 356/500 [====================>.........] - ETA: 48s - loss: 0.0695 - regression_loss: 0.0642 - classification_loss: 0.0053 357/500 [====================>.........] - ETA: 48s - loss: 0.0696 - regression_loss: 0.0643 - classification_loss: 0.0054 358/500 [====================>.........] - ETA: 47s - loss: 0.0697 - regression_loss: 0.0643 - classification_loss: 0.0054 359/500 [====================>.........] - ETA: 47s - loss: 0.0695 - regression_loss: 0.0642 - classification_loss: 0.0053 360/500 [====================>.........] - ETA: 47s - loss: 0.0694 - regression_loss: 0.0640 - classification_loss: 0.0053 361/500 [====================>.........] - ETA: 46s - loss: 0.0693 - regression_loss: 0.0640 - classification_loss: 0.0053 362/500 [====================>.........] - ETA: 46s - loss: 0.0692 - regression_loss: 0.0639 - classification_loss: 0.0053 363/500 [====================>.........] - ETA: 46s - loss: 0.0691 - regression_loss: 0.0638 - classification_loss: 0.0053 364/500 [====================>.........] - ETA: 45s - loss: 0.0689 - regression_loss: 0.0637 - classification_loss: 0.0053 365/500 [====================>.........] 
- ETA: 45s - loss: 0.0691 - regression_loss: 0.0638 - classification_loss: 0.0053 366/500 [====================>.........] - ETA: 45s - loss: 0.0690 - regression_loss: 0.0637 - classification_loss: 0.0053 367/500 [=====================>........] - ETA: 44s - loss: 0.0689 - regression_loss: 0.0636 - classification_loss: 0.0053 368/500 [=====================>........] - ETA: 44s - loss: 0.0689 - regression_loss: 0.0636 - classification_loss: 0.0053 369/500 [=====================>........] - ETA: 44s - loss: 0.0692 - regression_loss: 0.0639 - classification_loss: 0.0053 370/500 [=====================>........] - ETA: 43s - loss: 0.0690 - regression_loss: 0.0637 - classification_loss: 0.0053 371/500 [=====================>........] - ETA: 43s - loss: 0.0692 - regression_loss: 0.0639 - classification_loss: 0.0053 372/500 [=====================>........] - ETA: 43s - loss: 0.0691 - regression_loss: 0.0638 - classification_loss: 0.0053 373/500 [=====================>........] - ETA: 42s - loss: 0.0691 - regression_loss: 0.0638 - classification_loss: 0.0053 374/500 [=====================>........] - ETA: 42s - loss: 0.0691 - regression_loss: 0.0637 - classification_loss: 0.0054 375/500 [=====================>........] - ETA: 42s - loss: 0.0690 - regression_loss: 0.0637 - classification_loss: 0.0053 376/500 [=====================>........] - ETA: 41s - loss: 0.0691 - regression_loss: 0.0637 - classification_loss: 0.0053 377/500 [=====================>........] - ETA: 41s - loss: 0.0691 - regression_loss: 0.0637 - classification_loss: 0.0054 378/500 [=====================>........] - ETA: 41s - loss: 0.0691 - regression_loss: 0.0638 - classification_loss: 0.0054 379/500 [=====================>........] - ETA: 40s - loss: 0.0692 - regression_loss: 0.0638 - classification_loss: 0.0054 380/500 [=====================>........] - ETA: 40s - loss: 0.0691 - regression_loss: 0.0638 - classification_loss: 0.0054 381/500 [=====================>........] 
- ETA: 40s - loss: 0.0690 - regression_loss: 0.0636 - classification_loss: 0.0053 382/500 [=====================>........] - ETA: 39s - loss: 0.0688 - regression_loss: 0.0635 - classification_loss: 0.0053 383/500 [=====================>........] - ETA: 39s - loss: 0.0688 - regression_loss: 0.0635 - classification_loss: 0.0053 384/500 [======================>.......] - ETA: 39s - loss: 0.0688 - regression_loss: 0.0635 - classification_loss: 0.0053 385/500 [======================>.......] - ETA: 38s - loss: 0.0686 - regression_loss: 0.0633 - classification_loss: 0.0053 386/500 [======================>.......] - ETA: 38s - loss: 0.0685 - regression_loss: 0.0632 - classification_loss: 0.0053 387/500 [======================>.......] - ETA: 38s - loss: 0.0684 - regression_loss: 0.0631 - classification_loss: 0.0053 388/500 [======================>.......] - ETA: 37s - loss: 0.0685 - regression_loss: 0.0632 - classification_loss: 0.0053 389/500 [======================>.......] - ETA: 37s - loss: 0.0683 - regression_loss: 0.0631 - classification_loss: 0.0053 390/500 [======================>.......] - ETA: 37s - loss: 0.0683 - regression_loss: 0.0630 - classification_loss: 0.0053 391/500 [======================>.......] - ETA: 36s - loss: 0.0684 - regression_loss: 0.0631 - classification_loss: 0.0053 392/500 [======================>.......] - ETA: 36s - loss: 0.0684 - regression_loss: 0.0631 - classification_loss: 0.0053 393/500 [======================>.......] - ETA: 35s - loss: 0.0683 - regression_loss: 0.0630 - classification_loss: 0.0053 394/500 [======================>.......] - ETA: 35s - loss: 0.0681 - regression_loss: 0.0628 - classification_loss: 0.0053 395/500 [======================>.......] - ETA: 35s - loss: 0.0682 - regression_loss: 0.0629 - classification_loss: 0.0053 396/500 [======================>.......] - ETA: 34s - loss: 0.0681 - regression_loss: 0.0628 - classification_loss: 0.0053 397/500 [======================>.......] 
- ETA: 34s - loss: 0.0683 - regression_loss: 0.0630 - classification_loss: 0.0053 398/500 [======================>.......] - ETA: 34s - loss: 0.0682 - regression_loss: 0.0629 - classification_loss: 0.0053 399/500 [======================>.......] - ETA: 33s - loss: 0.0682 - regression_loss: 0.0629 - classification_loss: 0.0053 400/500 [=======================>......] - ETA: 33s - loss: 0.0681 - regression_loss: 0.0628 - classification_loss: 0.0053 401/500 [=======================>......] - ETA: 33s - loss: 0.0684 - regression_loss: 0.0631 - classification_loss: 0.0053 402/500 [=======================>......] - ETA: 32s - loss: 0.0684 - regression_loss: 0.0631 - classification_loss: 0.0053 403/500 [=======================>......] - ETA: 32s - loss: 0.0683 - regression_loss: 0.0630 - classification_loss: 0.0053 404/500 [=======================>......] - ETA: 32s - loss: 0.0681 - regression_loss: 0.0629 - classification_loss: 0.0053 405/500 [=======================>......] - ETA: 31s - loss: 0.0680 - regression_loss: 0.0627 - classification_loss: 0.0053 406/500 [=======================>......] - ETA: 31s - loss: 0.0680 - regression_loss: 0.0627 - classification_loss: 0.0053 407/500 [=======================>......] - ETA: 31s - loss: 0.0680 - regression_loss: 0.0627 - classification_loss: 0.0053 408/500 [=======================>......] - ETA: 30s - loss: 0.0680 - regression_loss: 0.0627 - classification_loss: 0.0053 409/500 [=======================>......] - ETA: 30s - loss: 0.0679 - regression_loss: 0.0627 - classification_loss: 0.0053 410/500 [=======================>......] - ETA: 30s - loss: 0.0679 - regression_loss: 0.0626 - classification_loss: 0.0052 411/500 [=======================>......] - ETA: 29s - loss: 0.0679 - regression_loss: 0.0627 - classification_loss: 0.0052 412/500 [=======================>......] - ETA: 29s - loss: 0.0679 - regression_loss: 0.0626 - classification_loss: 0.0052 413/500 [=======================>......] 
- ETA: 29s - loss: 0.0679 - regression_loss: 0.0627 - classification_loss: 0.0052 414/500 [=======================>......] - ETA: 28s - loss: 0.0679 - regression_loss: 0.0627 - classification_loss: 0.0052 415/500 [=======================>......] - ETA: 28s - loss: 0.0678 - regression_loss: 0.0626 - classification_loss: 0.0052 416/500 [=======================>......] - ETA: 28s - loss: 0.0677 - regression_loss: 0.0624 - classification_loss: 0.0052 417/500 [========================>.....] - ETA: 27s - loss: 0.0676 - regression_loss: 0.0624 - classification_loss: 0.0052 418/500 [========================>.....] - ETA: 27s - loss: 0.0677 - regression_loss: 0.0625 - classification_loss: 0.0052 419/500 [========================>.....] - ETA: 27s - loss: 0.0676 - regression_loss: 0.0624 - classification_loss: 0.0052 420/500 [========================>.....] - ETA: 26s - loss: 0.0676 - regression_loss: 0.0624 - classification_loss: 0.0052 421/500 [========================>.....] - ETA: 26s - loss: 0.0677 - regression_loss: 0.0625 - classification_loss: 0.0052 422/500 [========================>.....] - ETA: 26s - loss: 0.0676 - regression_loss: 0.0624 - classification_loss: 0.0052 423/500 [========================>.....] - ETA: 25s - loss: 0.0675 - regression_loss: 0.0623 - classification_loss: 0.0052 424/500 [========================>.....] - ETA: 25s - loss: 0.0676 - regression_loss: 0.0624 - classification_loss: 0.0052 425/500 [========================>.....] - ETA: 25s - loss: 0.0675 - regression_loss: 0.0623 - classification_loss: 0.0052 426/500 [========================>.....] - ETA: 24s - loss: 0.0676 - regression_loss: 0.0624 - classification_loss: 0.0052 427/500 [========================>.....] - ETA: 24s - loss: 0.0675 - regression_loss: 0.0623 - classification_loss: 0.0052 428/500 [========================>.....] - ETA: 24s - loss: 0.0675 - regression_loss: 0.0624 - classification_loss: 0.0052 429/500 [========================>.....] 
- ETA: 23s - loss: 0.0675 - regression_loss: 0.0624 - classification_loss: 0.0052 430/500 [========================>.....] - ETA: 23s - loss: 0.0675 - regression_loss: 0.0624 - classification_loss: 0.0052 431/500 [========================>.....] - ETA: 23s - loss: 0.0677 - regression_loss: 0.0625 - classification_loss: 0.0052 432/500 [========================>.....] - ETA: 22s - loss: 0.0680 - regression_loss: 0.0628 - classification_loss: 0.0052 433/500 [========================>.....] - ETA: 22s - loss: 0.0679 - regression_loss: 0.0627 - classification_loss: 0.0052 434/500 [=========================>....] - ETA: 22s - loss: 0.0679 - regression_loss: 0.0627 - classification_loss: 0.0052 435/500 [=========================>....] - ETA: 21s - loss: 0.0679 - regression_loss: 0.0627 - classification_loss: 0.0052 436/500 [=========================>....] - ETA: 21s - loss: 0.0677 - regression_loss: 0.0625 - classification_loss: 0.0052 437/500 [=========================>....] - ETA: 21s - loss: 0.0676 - regression_loss: 0.0624 - classification_loss: 0.0052 438/500 [=========================>....] - ETA: 20s - loss: 0.0676 - regression_loss: 0.0624 - classification_loss: 0.0052 439/500 [=========================>....] - ETA: 20s - loss: 0.0675 - regression_loss: 0.0623 - classification_loss: 0.0052 440/500 [=========================>....] - ETA: 20s - loss: 0.0677 - regression_loss: 0.0625 - classification_loss: 0.0052 441/500 [=========================>....] - ETA: 19s - loss: 0.0677 - regression_loss: 0.0625 - classification_loss: 0.0052 442/500 [=========================>....] - ETA: 19s - loss: 0.0678 - regression_loss: 0.0625 - classification_loss: 0.0052 443/500 [=========================>....] - ETA: 19s - loss: 0.0677 - regression_loss: 0.0624 - classification_loss: 0.0052 444/500 [=========================>....] - ETA: 18s - loss: 0.0677 - regression_loss: 0.0625 - classification_loss: 0.0052 445/500 [=========================>....] 
- ETA: 18s - loss: 0.0678 - regression_loss: 0.0626 - classification_loss: 0.0052 446/500 [=========================>....] - ETA: 18s - loss: 0.0678 - regression_loss: 0.0626 - classification_loss: 0.0052 447/500 [=========================>....] - ETA: 17s - loss: 0.0679 - regression_loss: 0.0627 - classification_loss: 0.0053 448/500 [=========================>....] - ETA: 17s - loss: 0.0680 - regression_loss: 0.0627 - classification_loss: 0.0053 449/500 [=========================>....] - ETA: 17s - loss: 0.0682 - regression_loss: 0.0630 - classification_loss: 0.0053 450/500 [==========================>...] - ETA: 16s - loss: 0.0681 - regression_loss: 0.0628 - classification_loss: 0.0053 451/500 [==========================>...] - ETA: 16s - loss: 0.0680 - regression_loss: 0.0628 - classification_loss: 0.0052 452/500 [==========================>...] - ETA: 16s - loss: 0.0679 - regression_loss: 0.0627 - classification_loss: 0.0052 453/500 [==========================>...] - ETA: 15s - loss: 0.0678 - regression_loss: 0.0626 - classification_loss: 0.0052 454/500 [==========================>...] - ETA: 15s - loss: 0.0680 - regression_loss: 0.0627 - classification_loss: 0.0052 455/500 [==========================>...] - ETA: 15s - loss: 0.0679 - regression_loss: 0.0627 - classification_loss: 0.0052 456/500 [==========================>...] - ETA: 14s - loss: 0.0679 - regression_loss: 0.0627 - classification_loss: 0.0052 457/500 [==========================>...] - ETA: 14s - loss: 0.0679 - regression_loss: 0.0626 - classification_loss: 0.0052 458/500 [==========================>...] - ETA: 14s - loss: 0.0678 - regression_loss: 0.0625 - classification_loss: 0.0052 459/500 [==========================>...] - ETA: 13s - loss: 0.0677 - regression_loss: 0.0625 - classification_loss: 0.0052 460/500 [==========================>...] - ETA: 13s - loss: 0.0677 - regression_loss: 0.0625 - classification_loss: 0.0052 461/500 [==========================>...] 
[per-batch progress output for steps 461-499 of epoch 58 elided; loss held steady around 0.068]
500/500 [==============================] - 168s 336ms/step - loss: 0.0691 - regression_loss: 0.0637 - classification_loss: 0.0054
1172 instances of class plum with average precision: 0.7602
mAP: 0.7602
Epoch 00058: saving model to ./training/snapshots/resnet101_pascal_58.h5
Epoch 59/150
[per-batch progress output for steps 1-296 of epoch 59 elided; loss started near 0.106 at step 1 and settled around 0.069 by step 296, with classification_loss staying in the 0.003-0.005 range]
- ETA: 1:08 - loss: 0.0688 - regression_loss: 0.0639 - classification_loss: 0.0049 297/500 [================>.............] - ETA: 1:08 - loss: 0.0688 - regression_loss: 0.0640 - classification_loss: 0.0049 298/500 [================>.............] - ETA: 1:07 - loss: 0.0689 - regression_loss: 0.0640 - classification_loss: 0.0049 299/500 [================>.............] - ETA: 1:07 - loss: 0.0688 - regression_loss: 0.0640 - classification_loss: 0.0049 300/500 [=================>............] - ETA: 1:07 - loss: 0.0688 - regression_loss: 0.0639 - classification_loss: 0.0049 301/500 [=================>............] - ETA: 1:06 - loss: 0.0687 - regression_loss: 0.0638 - classification_loss: 0.0049 302/500 [=================>............] - ETA: 1:06 - loss: 0.0686 - regression_loss: 0.0637 - classification_loss: 0.0049 303/500 [=================>............] - ETA: 1:06 - loss: 0.0684 - regression_loss: 0.0636 - classification_loss: 0.0048 304/500 [=================>............] - ETA: 1:05 - loss: 0.0685 - regression_loss: 0.0637 - classification_loss: 0.0048 305/500 [=================>............] - ETA: 1:05 - loss: 0.0684 - regression_loss: 0.0636 - classification_loss: 0.0048 306/500 [=================>............] - ETA: 1:05 - loss: 0.0684 - regression_loss: 0.0636 - classification_loss: 0.0048 307/500 [=================>............] - ETA: 1:04 - loss: 0.0683 - regression_loss: 0.0635 - classification_loss: 0.0048 308/500 [=================>............] - ETA: 1:04 - loss: 0.0683 - regression_loss: 0.0635 - classification_loss: 0.0048 309/500 [=================>............] - ETA: 1:04 - loss: 0.0688 - regression_loss: 0.0639 - classification_loss: 0.0048 310/500 [=================>............] - ETA: 1:03 - loss: 0.0686 - regression_loss: 0.0638 - classification_loss: 0.0048 311/500 [=================>............] - ETA: 1:03 - loss: 0.0685 - regression_loss: 0.0637 - classification_loss: 0.0048 312/500 [=================>............] 
- ETA: 1:03 - loss: 0.0683 - regression_loss: 0.0635 - classification_loss: 0.0048 313/500 [=================>............] - ETA: 1:02 - loss: 0.0683 - regression_loss: 0.0635 - classification_loss: 0.0048 314/500 [=================>............] - ETA: 1:02 - loss: 0.0682 - regression_loss: 0.0634 - classification_loss: 0.0048 315/500 [=================>............] - ETA: 1:02 - loss: 0.0681 - regression_loss: 0.0633 - classification_loss: 0.0048 316/500 [=================>............] - ETA: 1:01 - loss: 0.0683 - regression_loss: 0.0635 - classification_loss: 0.0048 317/500 [==================>...........] - ETA: 1:01 - loss: 0.0681 - regression_loss: 0.0633 - classification_loss: 0.0048 318/500 [==================>...........] - ETA: 1:01 - loss: 0.0681 - regression_loss: 0.0633 - classification_loss: 0.0048 319/500 [==================>...........] - ETA: 1:00 - loss: 0.0680 - regression_loss: 0.0632 - classification_loss: 0.0048 320/500 [==================>...........] - ETA: 1:00 - loss: 0.0681 - regression_loss: 0.0632 - classification_loss: 0.0048 321/500 [==================>...........] - ETA: 1:00 - loss: 0.0681 - regression_loss: 0.0632 - classification_loss: 0.0049 322/500 [==================>...........] - ETA: 59s - loss: 0.0681 - regression_loss: 0.0633 - classification_loss: 0.0049  323/500 [==================>...........] - ETA: 59s - loss: 0.0681 - regression_loss: 0.0632 - classification_loss: 0.0049 324/500 [==================>...........] - ETA: 59s - loss: 0.0680 - regression_loss: 0.0631 - classification_loss: 0.0049 325/500 [==================>...........] - ETA: 58s - loss: 0.0678 - regression_loss: 0.0630 - classification_loss: 0.0048 326/500 [==================>...........] - ETA: 58s - loss: 0.0678 - regression_loss: 0.0629 - classification_loss: 0.0048 327/500 [==================>...........] - ETA: 58s - loss: 0.0676 - regression_loss: 0.0628 - classification_loss: 0.0048 328/500 [==================>...........] 
- ETA: 57s - loss: 0.0678 - regression_loss: 0.0629 - classification_loss: 0.0049 329/500 [==================>...........] - ETA: 57s - loss: 0.0680 - regression_loss: 0.0631 - classification_loss: 0.0049 330/500 [==================>...........] - ETA: 57s - loss: 0.0681 - regression_loss: 0.0632 - classification_loss: 0.0049 331/500 [==================>...........] - ETA: 56s - loss: 0.0681 - regression_loss: 0.0631 - classification_loss: 0.0049 332/500 [==================>...........] - ETA: 56s - loss: 0.0680 - regression_loss: 0.0631 - classification_loss: 0.0049 333/500 [==================>...........] - ETA: 56s - loss: 0.0680 - regression_loss: 0.0631 - classification_loss: 0.0049 334/500 [===================>..........] - ETA: 55s - loss: 0.0679 - regression_loss: 0.0630 - classification_loss: 0.0049 335/500 [===================>..........] - ETA: 55s - loss: 0.0677 - regression_loss: 0.0628 - classification_loss: 0.0049 336/500 [===================>..........] - ETA: 55s - loss: 0.0677 - regression_loss: 0.0628 - classification_loss: 0.0049 337/500 [===================>..........] - ETA: 54s - loss: 0.0676 - regression_loss: 0.0627 - classification_loss: 0.0049 338/500 [===================>..........] - ETA: 54s - loss: 0.0674 - regression_loss: 0.0625 - classification_loss: 0.0049 339/500 [===================>..........] - ETA: 53s - loss: 0.0672 - regression_loss: 0.0624 - classification_loss: 0.0049 340/500 [===================>..........] - ETA: 53s - loss: 0.0673 - regression_loss: 0.0625 - classification_loss: 0.0049 341/500 [===================>..........] - ETA: 53s - loss: 0.0674 - regression_loss: 0.0626 - classification_loss: 0.0049 342/500 [===================>..........] - ETA: 52s - loss: 0.0674 - regression_loss: 0.0625 - classification_loss: 0.0049 343/500 [===================>..........] - ETA: 52s - loss: 0.0674 - regression_loss: 0.0626 - classification_loss: 0.0049 344/500 [===================>..........] 
- ETA: 52s - loss: 0.0677 - regression_loss: 0.0629 - classification_loss: 0.0049 345/500 [===================>..........] - ETA: 51s - loss: 0.0676 - regression_loss: 0.0628 - classification_loss: 0.0048 346/500 [===================>..........] - ETA: 51s - loss: 0.0676 - regression_loss: 0.0627 - classification_loss: 0.0048 347/500 [===================>..........] - ETA: 51s - loss: 0.0674 - regression_loss: 0.0626 - classification_loss: 0.0048 348/500 [===================>..........] - ETA: 50s - loss: 0.0674 - regression_loss: 0.0625 - classification_loss: 0.0048 349/500 [===================>..........] - ETA: 50s - loss: 0.0672 - regression_loss: 0.0624 - classification_loss: 0.0048 350/500 [====================>.........] - ETA: 50s - loss: 0.0673 - regression_loss: 0.0625 - classification_loss: 0.0048 351/500 [====================>.........] - ETA: 49s - loss: 0.0674 - regression_loss: 0.0625 - classification_loss: 0.0048 352/500 [====================>.........] - ETA: 49s - loss: 0.0672 - regression_loss: 0.0624 - classification_loss: 0.0048 353/500 [====================>.........] - ETA: 49s - loss: 0.0672 - regression_loss: 0.0624 - classification_loss: 0.0048 354/500 [====================>.........] - ETA: 48s - loss: 0.0671 - regression_loss: 0.0623 - classification_loss: 0.0048 355/500 [====================>.........] - ETA: 48s - loss: 0.0670 - regression_loss: 0.0622 - classification_loss: 0.0048 356/500 [====================>.........] - ETA: 48s - loss: 0.0671 - regression_loss: 0.0623 - classification_loss: 0.0048 357/500 [====================>.........] - ETA: 47s - loss: 0.0670 - regression_loss: 0.0622 - classification_loss: 0.0048 358/500 [====================>.........] - ETA: 47s - loss: 0.0668 - regression_loss: 0.0621 - classification_loss: 0.0048 359/500 [====================>.........] - ETA: 47s - loss: 0.0667 - regression_loss: 0.0619 - classification_loss: 0.0047 360/500 [====================>.........] 
- ETA: 46s - loss: 0.0665 - regression_loss: 0.0618 - classification_loss: 0.0047 361/500 [====================>.........] - ETA: 46s - loss: 0.0664 - regression_loss: 0.0617 - classification_loss: 0.0047 362/500 [====================>.........] - ETA: 46s - loss: 0.0663 - regression_loss: 0.0616 - classification_loss: 0.0047 363/500 [====================>.........] - ETA: 45s - loss: 0.0663 - regression_loss: 0.0616 - classification_loss: 0.0047 364/500 [====================>.........] - ETA: 45s - loss: 0.0663 - regression_loss: 0.0616 - classification_loss: 0.0047 365/500 [====================>.........] - ETA: 45s - loss: 0.0667 - regression_loss: 0.0620 - classification_loss: 0.0048 366/500 [====================>.........] - ETA: 44s - loss: 0.0668 - regression_loss: 0.0621 - classification_loss: 0.0048 367/500 [=====================>........] - ETA: 44s - loss: 0.0667 - regression_loss: 0.0620 - classification_loss: 0.0048 368/500 [=====================>........] - ETA: 44s - loss: 0.0668 - regression_loss: 0.0621 - classification_loss: 0.0048 369/500 [=====================>........] - ETA: 43s - loss: 0.0669 - regression_loss: 0.0621 - classification_loss: 0.0048 370/500 [=====================>........] - ETA: 43s - loss: 0.0667 - regression_loss: 0.0620 - classification_loss: 0.0048 371/500 [=====================>........] - ETA: 43s - loss: 0.0668 - regression_loss: 0.0620 - classification_loss: 0.0048 372/500 [=====================>........] - ETA: 42s - loss: 0.0667 - regression_loss: 0.0619 - classification_loss: 0.0048 373/500 [=====================>........] - ETA: 42s - loss: 0.0666 - regression_loss: 0.0618 - classification_loss: 0.0048 374/500 [=====================>........] - ETA: 42s - loss: 0.0666 - regression_loss: 0.0619 - classification_loss: 0.0048 375/500 [=====================>........] - ETA: 41s - loss: 0.0667 - regression_loss: 0.0619 - classification_loss: 0.0048 376/500 [=====================>........] 
- ETA: 41s - loss: 0.0666 - regression_loss: 0.0619 - classification_loss: 0.0048 377/500 [=====================>........] - ETA: 41s - loss: 0.0666 - regression_loss: 0.0618 - classification_loss: 0.0048 378/500 [=====================>........] - ETA: 40s - loss: 0.0671 - regression_loss: 0.0621 - classification_loss: 0.0049 379/500 [=====================>........] - ETA: 40s - loss: 0.0670 - regression_loss: 0.0620 - classification_loss: 0.0049 380/500 [=====================>........] - ETA: 40s - loss: 0.0668 - regression_loss: 0.0619 - classification_loss: 0.0049 381/500 [=====================>........] - ETA: 39s - loss: 0.0669 - regression_loss: 0.0620 - classification_loss: 0.0049 382/500 [=====================>........] - ETA: 39s - loss: 0.0668 - regression_loss: 0.0619 - classification_loss: 0.0049 383/500 [=====================>........] - ETA: 39s - loss: 0.0667 - regression_loss: 0.0618 - classification_loss: 0.0049 384/500 [======================>.......] - ETA: 38s - loss: 0.0674 - regression_loss: 0.0625 - classification_loss: 0.0049 385/500 [======================>.......] - ETA: 38s - loss: 0.0673 - regression_loss: 0.0624 - classification_loss: 0.0049 386/500 [======================>.......] - ETA: 38s - loss: 0.0673 - regression_loss: 0.0624 - classification_loss: 0.0049 387/500 [======================>.......] - ETA: 37s - loss: 0.0675 - regression_loss: 0.0626 - classification_loss: 0.0049 388/500 [======================>.......] - ETA: 37s - loss: 0.0674 - regression_loss: 0.0625 - classification_loss: 0.0049 389/500 [======================>.......] - ETA: 37s - loss: 0.0674 - regression_loss: 0.0625 - classification_loss: 0.0049 390/500 [======================>.......] - ETA: 36s - loss: 0.0674 - regression_loss: 0.0625 - classification_loss: 0.0049 391/500 [======================>.......] - ETA: 36s - loss: 0.0676 - regression_loss: 0.0627 - classification_loss: 0.0049 392/500 [======================>.......] 
- ETA: 36s - loss: 0.0676 - regression_loss: 0.0628 - classification_loss: 0.0049 393/500 [======================>.......] - ETA: 35s - loss: 0.0677 - regression_loss: 0.0628 - classification_loss: 0.0049 394/500 [======================>.......] - ETA: 35s - loss: 0.0676 - regression_loss: 0.0627 - classification_loss: 0.0049 395/500 [======================>.......] - ETA: 35s - loss: 0.0674 - regression_loss: 0.0626 - classification_loss: 0.0049 396/500 [======================>.......] - ETA: 34s - loss: 0.0673 - regression_loss: 0.0625 - classification_loss: 0.0049 397/500 [======================>.......] - ETA: 34s - loss: 0.0674 - regression_loss: 0.0625 - classification_loss: 0.0049 398/500 [======================>.......] - ETA: 34s - loss: 0.0675 - regression_loss: 0.0626 - classification_loss: 0.0049 399/500 [======================>.......] - ETA: 33s - loss: 0.0673 - regression_loss: 0.0624 - classification_loss: 0.0049 400/500 [=======================>......] - ETA: 33s - loss: 0.0677 - regression_loss: 0.0627 - classification_loss: 0.0049 401/500 [=======================>......] - ETA: 33s - loss: 0.0676 - regression_loss: 0.0627 - classification_loss: 0.0049 402/500 [=======================>......] - ETA: 32s - loss: 0.0675 - regression_loss: 0.0626 - classification_loss: 0.0049 403/500 [=======================>......] - ETA: 32s - loss: 0.0674 - regression_loss: 0.0625 - classification_loss: 0.0049 404/500 [=======================>......] - ETA: 32s - loss: 0.0672 - regression_loss: 0.0623 - classification_loss: 0.0049 405/500 [=======================>......] - ETA: 31s - loss: 0.0672 - regression_loss: 0.0623 - classification_loss: 0.0049 406/500 [=======================>......] - ETA: 31s - loss: 0.0671 - regression_loss: 0.0622 - classification_loss: 0.0049 407/500 [=======================>......] - ETA: 31s - loss: 0.0670 - regression_loss: 0.0621 - classification_loss: 0.0049 408/500 [=======================>......] 
- ETA: 30s - loss: 0.0670 - regression_loss: 0.0621 - classification_loss: 0.0049 409/500 [=======================>......] - ETA: 30s - loss: 0.0670 - regression_loss: 0.0621 - classification_loss: 0.0049 410/500 [=======================>......] - ETA: 30s - loss: 0.0669 - regression_loss: 0.0621 - classification_loss: 0.0049 411/500 [=======================>......] - ETA: 29s - loss: 0.0669 - regression_loss: 0.0620 - classification_loss: 0.0049 412/500 [=======================>......] - ETA: 29s - loss: 0.0670 - regression_loss: 0.0621 - classification_loss: 0.0049 413/500 [=======================>......] - ETA: 29s - loss: 0.0669 - regression_loss: 0.0621 - classification_loss: 0.0049 414/500 [=======================>......] - ETA: 28s - loss: 0.0669 - regression_loss: 0.0621 - classification_loss: 0.0049 415/500 [=======================>......] - ETA: 28s - loss: 0.0669 - regression_loss: 0.0620 - classification_loss: 0.0049 416/500 [=======================>......] - ETA: 28s - loss: 0.0668 - regression_loss: 0.0619 - classification_loss: 0.0049 417/500 [========================>.....] - ETA: 27s - loss: 0.0671 - regression_loss: 0.0622 - classification_loss: 0.0049 418/500 [========================>.....] - ETA: 27s - loss: 0.0670 - regression_loss: 0.0621 - classification_loss: 0.0049 419/500 [========================>.....] - ETA: 27s - loss: 0.0669 - regression_loss: 0.0620 - classification_loss: 0.0049 420/500 [========================>.....] - ETA: 26s - loss: 0.0668 - regression_loss: 0.0619 - classification_loss: 0.0049 421/500 [========================>.....] - ETA: 26s - loss: 0.0667 - regression_loss: 0.0619 - classification_loss: 0.0049 422/500 [========================>.....] - ETA: 26s - loss: 0.0666 - regression_loss: 0.0618 - classification_loss: 0.0049 423/500 [========================>.....] - ETA: 25s - loss: 0.0666 - regression_loss: 0.0617 - classification_loss: 0.0049 424/500 [========================>.....] 
- ETA: 25s - loss: 0.0667 - regression_loss: 0.0619 - classification_loss: 0.0049 425/500 [========================>.....] - ETA: 25s - loss: 0.0672 - regression_loss: 0.0622 - classification_loss: 0.0050 426/500 [========================>.....] - ETA: 24s - loss: 0.0671 - regression_loss: 0.0621 - classification_loss: 0.0050 427/500 [========================>.....] - ETA: 24s - loss: 0.0671 - regression_loss: 0.0621 - classification_loss: 0.0050 428/500 [========================>.....] - ETA: 24s - loss: 0.0670 - regression_loss: 0.0620 - classification_loss: 0.0050 429/500 [========================>.....] - ETA: 23s - loss: 0.0669 - regression_loss: 0.0620 - classification_loss: 0.0050 430/500 [========================>.....] - ETA: 23s - loss: 0.0668 - regression_loss: 0.0619 - classification_loss: 0.0050 431/500 [========================>.....] - ETA: 23s - loss: 0.0667 - regression_loss: 0.0618 - classification_loss: 0.0050 432/500 [========================>.....] - ETA: 22s - loss: 0.0667 - regression_loss: 0.0618 - classification_loss: 0.0050 433/500 [========================>.....] - ETA: 22s - loss: 0.0666 - regression_loss: 0.0616 - classification_loss: 0.0050 434/500 [=========================>....] - ETA: 22s - loss: 0.0665 - regression_loss: 0.0615 - classification_loss: 0.0049 435/500 [=========================>....] - ETA: 21s - loss: 0.0665 - regression_loss: 0.0616 - classification_loss: 0.0050 436/500 [=========================>....] - ETA: 21s - loss: 0.0666 - regression_loss: 0.0617 - classification_loss: 0.0050 437/500 [=========================>....] - ETA: 21s - loss: 0.0667 - regression_loss: 0.0617 - classification_loss: 0.0050 438/500 [=========================>....] - ETA: 20s - loss: 0.0666 - regression_loss: 0.0617 - classification_loss: 0.0050 439/500 [=========================>....] - ETA: 20s - loss: 0.0665 - regression_loss: 0.0616 - classification_loss: 0.0049 440/500 [=========================>....] 
- ETA: 20s - loss: 0.0665 - regression_loss: 0.0615 - classification_loss: 0.0049 441/500 [=========================>....] - ETA: 19s - loss: 0.0664 - regression_loss: 0.0615 - classification_loss: 0.0049 442/500 [=========================>....] - ETA: 19s - loss: 0.0665 - regression_loss: 0.0615 - classification_loss: 0.0049 443/500 [=========================>....] - ETA: 19s - loss: 0.0665 - regression_loss: 0.0615 - classification_loss: 0.0049 444/500 [=========================>....] - ETA: 18s - loss: 0.0665 - regression_loss: 0.0615 - classification_loss: 0.0049 445/500 [=========================>....] - ETA: 18s - loss: 0.0667 - regression_loss: 0.0617 - classification_loss: 0.0050 446/500 [=========================>....] - ETA: 18s - loss: 0.0666 - regression_loss: 0.0616 - classification_loss: 0.0049 447/500 [=========================>....] - ETA: 17s - loss: 0.0665 - regression_loss: 0.0615 - classification_loss: 0.0049 448/500 [=========================>....] - ETA: 17s - loss: 0.0664 - regression_loss: 0.0614 - classification_loss: 0.0049 449/500 [=========================>....] - ETA: 17s - loss: 0.0663 - regression_loss: 0.0614 - classification_loss: 0.0049 450/500 [==========================>...] - ETA: 16s - loss: 0.0662 - regression_loss: 0.0613 - classification_loss: 0.0049 451/500 [==========================>...] - ETA: 16s - loss: 0.0663 - regression_loss: 0.0614 - classification_loss: 0.0049 452/500 [==========================>...] - ETA: 16s - loss: 0.0663 - regression_loss: 0.0614 - classification_loss: 0.0049 453/500 [==========================>...] - ETA: 15s - loss: 0.0663 - regression_loss: 0.0614 - classification_loss: 0.0049 454/500 [==========================>...] - ETA: 15s - loss: 0.0662 - regression_loss: 0.0613 - classification_loss: 0.0049 455/500 [==========================>...] - ETA: 15s - loss: 0.0662 - regression_loss: 0.0613 - classification_loss: 0.0049 456/500 [==========================>...] 
- ETA: 14s - loss: 0.0661 - regression_loss: 0.0612 - classification_loss: 0.0049 457/500 [==========================>...] - ETA: 14s - loss: 0.0660 - regression_loss: 0.0611 - classification_loss: 0.0049 458/500 [==========================>...] - ETA: 14s - loss: 0.0660 - regression_loss: 0.0611 - classification_loss: 0.0049 459/500 [==========================>...] - ETA: 13s - loss: 0.0661 - regression_loss: 0.0612 - classification_loss: 0.0049 460/500 [==========================>...] - ETA: 13s - loss: 0.0660 - regression_loss: 0.0612 - classification_loss: 0.0049 461/500 [==========================>...] - ETA: 13s - loss: 0.0662 - regression_loss: 0.0613 - classification_loss: 0.0049 462/500 [==========================>...] - ETA: 12s - loss: 0.0661 - regression_loss: 0.0612 - classification_loss: 0.0049 463/500 [==========================>...] - ETA: 12s - loss: 0.0660 - regression_loss: 0.0611 - classification_loss: 0.0049 464/500 [==========================>...] - ETA: 12s - loss: 0.0661 - regression_loss: 0.0611 - classification_loss: 0.0049 465/500 [==========================>...] - ETA: 11s - loss: 0.0661 - regression_loss: 0.0612 - classification_loss: 0.0049 466/500 [==========================>...] - ETA: 11s - loss: 0.0661 - regression_loss: 0.0612 - classification_loss: 0.0049 467/500 [===========================>..] - ETA: 11s - loss: 0.0661 - regression_loss: 0.0611 - classification_loss: 0.0049 468/500 [===========================>..] - ETA: 10s - loss: 0.0660 - regression_loss: 0.0611 - classification_loss: 0.0049 469/500 [===========================>..] - ETA: 10s - loss: 0.0663 - regression_loss: 0.0614 - classification_loss: 0.0049 470/500 [===========================>..] - ETA: 10s - loss: 0.0663 - regression_loss: 0.0613 - classification_loss: 0.0049 471/500 [===========================>..] - ETA: 9s - loss: 0.0663 - regression_loss: 0.0614 - classification_loss: 0.0049  472/500 [===========================>..] 
- ETA: 9s - loss: 0.0663 - regression_loss: 0.0613 - classification_loss: 0.0049 473/500 [===========================>..] - ETA: 9s - loss: 0.0662 - regression_loss: 0.0613 - classification_loss: 0.0049 474/500 [===========================>..] - ETA: 8s - loss: 0.0663 - regression_loss: 0.0613 - classification_loss: 0.0049 475/500 [===========================>..] - ETA: 8s - loss: 0.0662 - regression_loss: 0.0613 - classification_loss: 0.0049 476/500 [===========================>..] - ETA: 8s - loss: 0.0665 - regression_loss: 0.0616 - classification_loss: 0.0049 477/500 [===========================>..] - ETA: 7s - loss: 0.0664 - regression_loss: 0.0615 - classification_loss: 0.0049 478/500 [===========================>..] - ETA: 7s - loss: 0.0663 - regression_loss: 0.0614 - classification_loss: 0.0049 479/500 [===========================>..] - ETA: 7s - loss: 0.0662 - regression_loss: 0.0613 - classification_loss: 0.0049 480/500 [===========================>..] - ETA: 6s - loss: 0.0664 - regression_loss: 0.0615 - classification_loss: 0.0049 481/500 [===========================>..] - ETA: 6s - loss: 0.0669 - regression_loss: 0.0620 - classification_loss: 0.0049 482/500 [===========================>..] - ETA: 6s - loss: 0.0668 - regression_loss: 0.0619 - classification_loss: 0.0049 483/500 [===========================>..] - ETA: 5s - loss: 0.0669 - regression_loss: 0.0620 - classification_loss: 0.0049 484/500 [============================>.] - ETA: 5s - loss: 0.0668 - regression_loss: 0.0619 - classification_loss: 0.0049 485/500 [============================>.] - ETA: 5s - loss: 0.0669 - regression_loss: 0.0620 - classification_loss: 0.0049 486/500 [============================>.] - ETA: 4s - loss: 0.0669 - regression_loss: 0.0620 - classification_loss: 0.0049 487/500 [============================>.] - ETA: 4s - loss: 0.0669 - regression_loss: 0.0620 - classification_loss: 0.0049 488/500 [============================>.] 
- ETA: 4s - loss: 0.0669 - regression_loss: 0.0619 - classification_loss: 0.0049 489/500 [============================>.] - ETA: 3s - loss: 0.0668 - regression_loss: 0.0618 - classification_loss: 0.0049 490/500 [============================>.] - ETA: 3s - loss: 0.0667 - regression_loss: 0.0618 - classification_loss: 0.0049 491/500 [============================>.] - ETA: 3s - loss: 0.0668 - regression_loss: 0.0618 - classification_loss: 0.0049 492/500 [============================>.] - ETA: 2s - loss: 0.0667 - regression_loss: 0.0618 - classification_loss: 0.0049 493/500 [============================>.] - ETA: 2s - loss: 0.0666 - regression_loss: 0.0617 - classification_loss: 0.0049 494/500 [============================>.] - ETA: 2s - loss: 0.0665 - regression_loss: 0.0616 - classification_loss: 0.0049 495/500 [============================>.] - ETA: 1s - loss: 0.0665 - regression_loss: 0.0616 - classification_loss: 0.0049 496/500 [============================>.] - ETA: 1s - loss: 0.0664 - regression_loss: 0.0615 - classification_loss: 0.0049 497/500 [============================>.] - ETA: 1s - loss: 0.0664 - regression_loss: 0.0615 - classification_loss: 0.0049 498/500 [============================>.] - ETA: 0s - loss: 0.0663 - regression_loss: 0.0614 - classification_loss: 0.0049 499/500 [============================>.] - ETA: 0s - loss: 0.0663 - regression_loss: 0.0614 - classification_loss: 0.0049 500/500 [==============================] - 168s 335ms/step - loss: 0.0662 - regression_loss: 0.0613 - classification_loss: 0.0049 1172 instances of class plum with average precision: 0.7562 mAP: 0.7562 Epoch 00059: saving model to ./training/snapshots/resnet101_pascal_59.h5 Epoch 60/150 1/500 [..............................] - ETA: 2:38 - loss: 0.0391 - regression_loss: 0.0360 - classification_loss: 0.0032 2/500 [..............................] - ETA: 2:44 - loss: 0.0285 - regression_loss: 0.0261 - classification_loss: 0.0024 3/500 [..............................] 
[per-step training progress for the start of epoch 60 elided; through step 67 the loss settled near 0.074]
- ETA: 2:26 - loss: 0.0756 - regression_loss: 0.0700 - classification_loss: 0.0056 68/500 [===>..........................] - ETA: 2:26 - loss: 0.0750 - regression_loss: 0.0694 - classification_loss: 0.0056 69/500 [===>..........................] - ETA: 2:25 - loss: 0.0745 - regression_loss: 0.0690 - classification_loss: 0.0055 70/500 [===>..........................] - ETA: 2:25 - loss: 0.0744 - regression_loss: 0.0688 - classification_loss: 0.0056 71/500 [===>..........................] - ETA: 2:24 - loss: 0.0735 - regression_loss: 0.0680 - classification_loss: 0.0055 72/500 [===>..........................] - ETA: 2:24 - loss: 0.0733 - regression_loss: 0.0678 - classification_loss: 0.0055 73/500 [===>..........................] - ETA: 2:24 - loss: 0.0727 - regression_loss: 0.0673 - classification_loss: 0.0055 74/500 [===>..........................] - ETA: 2:23 - loss: 0.0725 - regression_loss: 0.0671 - classification_loss: 0.0055 75/500 [===>..........................] - ETA: 2:23 - loss: 0.0727 - regression_loss: 0.0672 - classification_loss: 0.0055 76/500 [===>..........................] - ETA: 2:22 - loss: 0.0729 - regression_loss: 0.0675 - classification_loss: 0.0054 77/500 [===>..........................] - ETA: 2:22 - loss: 0.0736 - regression_loss: 0.0681 - classification_loss: 0.0055 78/500 [===>..........................] - ETA: 2:22 - loss: 0.0741 - regression_loss: 0.0686 - classification_loss: 0.0055 79/500 [===>..........................] - ETA: 2:21 - loss: 0.0743 - regression_loss: 0.0688 - classification_loss: 0.0055 80/500 [===>..........................] - ETA: 2:21 - loss: 0.0756 - regression_loss: 0.0701 - classification_loss: 0.0055 81/500 [===>..........................] - ETA: 2:21 - loss: 0.0750 - regression_loss: 0.0696 - classification_loss: 0.0055 82/500 [===>..........................] - ETA: 2:20 - loss: 0.0745 - regression_loss: 0.0690 - classification_loss: 0.0054 83/500 [===>..........................] 
- ETA: 2:20 - loss: 0.0755 - regression_loss: 0.0700 - classification_loss: 0.0055 84/500 [====>.........................] - ETA: 2:19 - loss: 0.0767 - regression_loss: 0.0712 - classification_loss: 0.0056 85/500 [====>.........................] - ETA: 2:19 - loss: 0.0760 - regression_loss: 0.0704 - classification_loss: 0.0055 86/500 [====>.........................] - ETA: 2:19 - loss: 0.0759 - regression_loss: 0.0704 - classification_loss: 0.0055 87/500 [====>.........................] - ETA: 2:18 - loss: 0.0754 - regression_loss: 0.0699 - classification_loss: 0.0055 88/500 [====>.........................] - ETA: 2:18 - loss: 0.0748 - regression_loss: 0.0693 - classification_loss: 0.0054 89/500 [====>.........................] - ETA: 2:18 - loss: 0.0741 - regression_loss: 0.0688 - classification_loss: 0.0054 90/500 [====>.........................] - ETA: 2:17 - loss: 0.0737 - regression_loss: 0.0683 - classification_loss: 0.0053 91/500 [====>.........................] - ETA: 2:17 - loss: 0.0732 - regression_loss: 0.0679 - classification_loss: 0.0053 92/500 [====>.........................] - ETA: 2:16 - loss: 0.0729 - regression_loss: 0.0676 - classification_loss: 0.0053 93/500 [====>.........................] - ETA: 2:16 - loss: 0.0732 - regression_loss: 0.0679 - classification_loss: 0.0053 94/500 [====>.........................] - ETA: 2:16 - loss: 0.0731 - regression_loss: 0.0678 - classification_loss: 0.0053 95/500 [====>.........................] - ETA: 2:15 - loss: 0.0750 - regression_loss: 0.0691 - classification_loss: 0.0059 96/500 [====>.........................] - ETA: 2:15 - loss: 0.0747 - regression_loss: 0.0688 - classification_loss: 0.0059 97/500 [====>.........................] - ETA: 2:15 - loss: 0.0751 - regression_loss: 0.0693 - classification_loss: 0.0059 98/500 [====>.........................] - ETA: 2:14 - loss: 0.0763 - regression_loss: 0.0704 - classification_loss: 0.0059 99/500 [====>.........................] 
- ETA: 2:14 - loss: 0.0768 - regression_loss: 0.0709 - classification_loss: 0.0059 100/500 [=====>........................] - ETA: 2:14 - loss: 0.0769 - regression_loss: 0.0710 - classification_loss: 0.0059 101/500 [=====>........................] - ETA: 2:13 - loss: 0.0763 - regression_loss: 0.0705 - classification_loss: 0.0059 102/500 [=====>........................] - ETA: 2:13 - loss: 0.0760 - regression_loss: 0.0702 - classification_loss: 0.0058 103/500 [=====>........................] - ETA: 2:13 - loss: 0.0761 - regression_loss: 0.0702 - classification_loss: 0.0059 104/500 [=====>........................] - ETA: 2:12 - loss: 0.0770 - regression_loss: 0.0712 - classification_loss: 0.0058 105/500 [=====>........................] - ETA: 2:12 - loss: 0.0766 - regression_loss: 0.0708 - classification_loss: 0.0058 106/500 [=====>........................] - ETA: 2:12 - loss: 0.0770 - regression_loss: 0.0713 - classification_loss: 0.0058 107/500 [=====>........................] - ETA: 2:11 - loss: 0.0770 - regression_loss: 0.0711 - classification_loss: 0.0058 108/500 [=====>........................] - ETA: 2:11 - loss: 0.0764 - regression_loss: 0.0707 - classification_loss: 0.0058 109/500 [=====>........................] - ETA: 2:11 - loss: 0.0760 - regression_loss: 0.0703 - classification_loss: 0.0058 110/500 [=====>........................] - ETA: 2:10 - loss: 0.0756 - regression_loss: 0.0699 - classification_loss: 0.0057 111/500 [=====>........................] - ETA: 2:10 - loss: 0.0756 - regression_loss: 0.0699 - classification_loss: 0.0057 112/500 [=====>........................] - ETA: 2:10 - loss: 0.0751 - regression_loss: 0.0694 - classification_loss: 0.0057 113/500 [=====>........................] - ETA: 2:10 - loss: 0.0745 - regression_loss: 0.0689 - classification_loss: 0.0057 114/500 [=====>........................] - ETA: 2:09 - loss: 0.0744 - regression_loss: 0.0687 - classification_loss: 0.0057 115/500 [=====>........................] 
- ETA: 2:09 - loss: 0.0746 - regression_loss: 0.0689 - classification_loss: 0.0057 116/500 [=====>........................] - ETA: 2:09 - loss: 0.0744 - regression_loss: 0.0687 - classification_loss: 0.0056 117/500 [======>.......................] - ETA: 2:08 - loss: 0.0746 - regression_loss: 0.0690 - classification_loss: 0.0056 118/500 [======>.......................] - ETA: 2:08 - loss: 0.0742 - regression_loss: 0.0686 - classification_loss: 0.0056 119/500 [======>.......................] - ETA: 2:08 - loss: 0.0738 - regression_loss: 0.0682 - classification_loss: 0.0056 120/500 [======>.......................] - ETA: 2:07 - loss: 0.0736 - regression_loss: 0.0681 - classification_loss: 0.0055 121/500 [======>.......................] - ETA: 2:07 - loss: 0.0738 - regression_loss: 0.0682 - classification_loss: 0.0055 122/500 [======>.......................] - ETA: 2:06 - loss: 0.0736 - regression_loss: 0.0681 - classification_loss: 0.0056 123/500 [======>.......................] - ETA: 2:06 - loss: 0.0738 - regression_loss: 0.0683 - classification_loss: 0.0055 124/500 [======>.......................] - ETA: 2:06 - loss: 0.0737 - regression_loss: 0.0682 - classification_loss: 0.0055 125/500 [======>.......................] - ETA: 2:05 - loss: 0.0733 - regression_loss: 0.0678 - classification_loss: 0.0055 126/500 [======>.......................] - ETA: 2:05 - loss: 0.0732 - regression_loss: 0.0677 - classification_loss: 0.0055 127/500 [======>.......................] - ETA: 2:05 - loss: 0.0744 - regression_loss: 0.0688 - classification_loss: 0.0056 128/500 [======>.......................] - ETA: 2:04 - loss: 0.0740 - regression_loss: 0.0684 - classification_loss: 0.0056 129/500 [======>.......................] - ETA: 2:04 - loss: 0.0739 - regression_loss: 0.0683 - classification_loss: 0.0056 130/500 [======>.......................] - ETA: 2:04 - loss: 0.0734 - regression_loss: 0.0678 - classification_loss: 0.0056 131/500 [======>.......................] 
- ETA: 2:03 - loss: 0.0734 - regression_loss: 0.0679 - classification_loss: 0.0055 132/500 [======>.......................] - ETA: 2:03 - loss: 0.0729 - regression_loss: 0.0674 - classification_loss: 0.0055 133/500 [======>.......................] - ETA: 2:03 - loss: 0.0725 - regression_loss: 0.0671 - classification_loss: 0.0055 134/500 [=======>......................] - ETA: 2:03 - loss: 0.0722 - regression_loss: 0.0667 - classification_loss: 0.0054 135/500 [=======>......................] - ETA: 2:02 - loss: 0.0723 - regression_loss: 0.0668 - classification_loss: 0.0054 136/500 [=======>......................] - ETA: 2:02 - loss: 0.0721 - regression_loss: 0.0667 - classification_loss: 0.0054 137/500 [=======>......................] - ETA: 2:02 - loss: 0.0722 - regression_loss: 0.0667 - classification_loss: 0.0054 138/500 [=======>......................] - ETA: 2:01 - loss: 0.0717 - regression_loss: 0.0663 - classification_loss: 0.0054 139/500 [=======>......................] - ETA: 2:01 - loss: 0.0716 - regression_loss: 0.0662 - classification_loss: 0.0054 140/500 [=======>......................] - ETA: 2:01 - loss: 0.0715 - regression_loss: 0.0661 - classification_loss: 0.0054 141/500 [=======>......................] - ETA: 2:00 - loss: 0.0712 - regression_loss: 0.0658 - classification_loss: 0.0054 142/500 [=======>......................] - ETA: 2:00 - loss: 0.0715 - regression_loss: 0.0661 - classification_loss: 0.0053 143/500 [=======>......................] - ETA: 1:59 - loss: 0.0713 - regression_loss: 0.0659 - classification_loss: 0.0053 144/500 [=======>......................] - ETA: 1:59 - loss: 0.0709 - regression_loss: 0.0655 - classification_loss: 0.0053 145/500 [=======>......................] - ETA: 1:59 - loss: 0.0707 - regression_loss: 0.0654 - classification_loss: 0.0053 146/500 [=======>......................] - ETA: 1:59 - loss: 0.0703 - regression_loss: 0.0650 - classification_loss: 0.0053 147/500 [=======>......................] 
- ETA: 1:58 - loss: 0.0700 - regression_loss: 0.0647 - classification_loss: 0.0053 148/500 [=======>......................] - ETA: 1:58 - loss: 0.0701 - regression_loss: 0.0648 - classification_loss: 0.0052 149/500 [=======>......................] - ETA: 1:57 - loss: 0.0699 - regression_loss: 0.0646 - classification_loss: 0.0052 150/500 [========>.....................] - ETA: 1:57 - loss: 0.0697 - regression_loss: 0.0645 - classification_loss: 0.0052 151/500 [========>.....................] - ETA: 1:57 - loss: 0.0695 - regression_loss: 0.0643 - classification_loss: 0.0052 152/500 [========>.....................] - ETA: 1:57 - loss: 0.0694 - regression_loss: 0.0642 - classification_loss: 0.0052 153/500 [========>.....................] - ETA: 1:56 - loss: 0.0693 - regression_loss: 0.0641 - classification_loss: 0.0052 154/500 [========>.....................] - ETA: 1:56 - loss: 0.0689 - regression_loss: 0.0638 - classification_loss: 0.0052 155/500 [========>.....................] - ETA: 1:56 - loss: 0.0692 - regression_loss: 0.0640 - classification_loss: 0.0052 156/500 [========>.....................] - ETA: 1:55 - loss: 0.0689 - regression_loss: 0.0638 - classification_loss: 0.0051 157/500 [========>.....................] - ETA: 1:55 - loss: 0.0689 - regression_loss: 0.0637 - classification_loss: 0.0051 158/500 [========>.....................] - ETA: 1:55 - loss: 0.0692 - regression_loss: 0.0640 - classification_loss: 0.0052 159/500 [========>.....................] - ETA: 1:54 - loss: 0.0690 - regression_loss: 0.0638 - classification_loss: 0.0052 160/500 [========>.....................] - ETA: 1:54 - loss: 0.0689 - regression_loss: 0.0637 - classification_loss: 0.0052 161/500 [========>.....................] - ETA: 1:54 - loss: 0.0691 - regression_loss: 0.0639 - classification_loss: 0.0051 162/500 [========>.....................] - ETA: 1:53 - loss: 0.0697 - regression_loss: 0.0646 - classification_loss: 0.0052 163/500 [========>.....................] 
- ETA: 1:53 - loss: 0.0701 - regression_loss: 0.0649 - classification_loss: 0.0053 164/500 [========>.....................] - ETA: 1:53 - loss: 0.0698 - regression_loss: 0.0646 - classification_loss: 0.0052 165/500 [========>.....................] - ETA: 1:52 - loss: 0.0697 - regression_loss: 0.0645 - classification_loss: 0.0053 166/500 [========>.....................] - ETA: 1:52 - loss: 0.0694 - regression_loss: 0.0642 - classification_loss: 0.0052 167/500 [=========>....................] - ETA: 1:52 - loss: 0.0700 - regression_loss: 0.0647 - classification_loss: 0.0052 168/500 [=========>....................] - ETA: 1:51 - loss: 0.0701 - regression_loss: 0.0649 - classification_loss: 0.0052 169/500 [=========>....................] - ETA: 1:51 - loss: 0.0705 - regression_loss: 0.0652 - classification_loss: 0.0053 170/500 [=========>....................] - ETA: 1:51 - loss: 0.0706 - regression_loss: 0.0653 - classification_loss: 0.0053 171/500 [=========>....................] - ETA: 1:50 - loss: 0.0703 - regression_loss: 0.0650 - classification_loss: 0.0053 172/500 [=========>....................] - ETA: 1:50 - loss: 0.0701 - regression_loss: 0.0648 - classification_loss: 0.0053 173/500 [=========>....................] - ETA: 1:50 - loss: 0.0698 - regression_loss: 0.0645 - classification_loss: 0.0053 174/500 [=========>....................] - ETA: 1:49 - loss: 0.0696 - regression_loss: 0.0643 - classification_loss: 0.0052 175/500 [=========>....................] - ETA: 1:49 - loss: 0.0693 - regression_loss: 0.0641 - classification_loss: 0.0052 176/500 [=========>....................] - ETA: 1:49 - loss: 0.0697 - regression_loss: 0.0645 - classification_loss: 0.0052 177/500 [=========>....................] - ETA: 1:48 - loss: 0.0697 - regression_loss: 0.0645 - classification_loss: 0.0053 178/500 [=========>....................] - ETA: 1:48 - loss: 0.0695 - regression_loss: 0.0642 - classification_loss: 0.0052 179/500 [=========>....................] 
- ETA: 1:48 - loss: 0.0706 - regression_loss: 0.0653 - classification_loss: 0.0053 180/500 [=========>....................] - ETA: 1:47 - loss: 0.0704 - regression_loss: 0.0651 - classification_loss: 0.0053 181/500 [=========>....................] - ETA: 1:47 - loss: 0.0705 - regression_loss: 0.0653 - classification_loss: 0.0053 182/500 [=========>....................] - ETA: 1:47 - loss: 0.0705 - regression_loss: 0.0653 - classification_loss: 0.0053 183/500 [=========>....................] - ETA: 1:46 - loss: 0.0706 - regression_loss: 0.0654 - classification_loss: 0.0053 184/500 [==========>...................] - ETA: 1:46 - loss: 0.0710 - regression_loss: 0.0657 - classification_loss: 0.0053 185/500 [==========>...................] - ETA: 1:46 - loss: 0.0709 - regression_loss: 0.0656 - classification_loss: 0.0053 186/500 [==========>...................] - ETA: 1:45 - loss: 0.0707 - regression_loss: 0.0654 - classification_loss: 0.0053 187/500 [==========>...................] - ETA: 1:45 - loss: 0.0704 - regression_loss: 0.0651 - classification_loss: 0.0053 188/500 [==========>...................] - ETA: 1:45 - loss: 0.0702 - regression_loss: 0.0649 - classification_loss: 0.0052 189/500 [==========>...................] - ETA: 1:44 - loss: 0.0699 - regression_loss: 0.0647 - classification_loss: 0.0052 190/500 [==========>...................] - ETA: 1:44 - loss: 0.0699 - regression_loss: 0.0647 - classification_loss: 0.0052 191/500 [==========>...................] - ETA: 1:44 - loss: 0.0698 - regression_loss: 0.0645 - classification_loss: 0.0052 192/500 [==========>...................] - ETA: 1:43 - loss: 0.0695 - regression_loss: 0.0643 - classification_loss: 0.0052 193/500 [==========>...................] - ETA: 1:43 - loss: 0.0700 - regression_loss: 0.0648 - classification_loss: 0.0052 194/500 [==========>...................] - ETA: 1:43 - loss: 0.0700 - regression_loss: 0.0648 - classification_loss: 0.0052 195/500 [==========>...................] 
- ETA: 1:42 - loss: 0.0698 - regression_loss: 0.0646 - classification_loss: 0.0052 196/500 [==========>...................] - ETA: 1:42 - loss: 0.0697 - regression_loss: 0.0645 - classification_loss: 0.0052 197/500 [==========>...................] - ETA: 1:41 - loss: 0.0695 - regression_loss: 0.0643 - classification_loss: 0.0052 198/500 [==========>...................] - ETA: 1:41 - loss: 0.0702 - regression_loss: 0.0650 - classification_loss: 0.0052 199/500 [==========>...................] - ETA: 1:41 - loss: 0.0699 - regression_loss: 0.0647 - classification_loss: 0.0052 200/500 [===========>..................] - ETA: 1:40 - loss: 0.0697 - regression_loss: 0.0645 - classification_loss: 0.0052 201/500 [===========>..................] - ETA: 1:40 - loss: 0.0694 - regression_loss: 0.0643 - classification_loss: 0.0052 202/500 [===========>..................] - ETA: 1:40 - loss: 0.0692 - regression_loss: 0.0640 - classification_loss: 0.0051 203/500 [===========>..................] - ETA: 1:40 - loss: 0.0692 - regression_loss: 0.0640 - classification_loss: 0.0052 204/500 [===========>..................] - ETA: 1:39 - loss: 0.0689 - regression_loss: 0.0638 - classification_loss: 0.0051 205/500 [===========>..................] - ETA: 1:39 - loss: 0.0688 - regression_loss: 0.0637 - classification_loss: 0.0051 206/500 [===========>..................] - ETA: 1:38 - loss: 0.0689 - regression_loss: 0.0638 - classification_loss: 0.0051 207/500 [===========>..................] - ETA: 1:38 - loss: 0.0688 - regression_loss: 0.0637 - classification_loss: 0.0051 208/500 [===========>..................] - ETA: 1:38 - loss: 0.0687 - regression_loss: 0.0636 - classification_loss: 0.0051 209/500 [===========>..................] - ETA: 1:37 - loss: 0.0686 - regression_loss: 0.0635 - classification_loss: 0.0051 210/500 [===========>..................] - ETA: 1:37 - loss: 0.0689 - regression_loss: 0.0638 - classification_loss: 0.0051 211/500 [===========>..................] 
- ETA: 1:37 - loss: 0.0686 - regression_loss: 0.0635 - classification_loss: 0.0051 212/500 [===========>..................] - ETA: 1:36 - loss: 0.0684 - regression_loss: 0.0633 - classification_loss: 0.0051 213/500 [===========>..................] - ETA: 1:36 - loss: 0.0681 - regression_loss: 0.0630 - classification_loss: 0.0050 214/500 [===========>..................] - ETA: 1:36 - loss: 0.0681 - regression_loss: 0.0630 - classification_loss: 0.0050 215/500 [===========>..................] - ETA: 1:35 - loss: 0.0679 - regression_loss: 0.0629 - classification_loss: 0.0050 216/500 [===========>..................] - ETA: 1:35 - loss: 0.0680 - regression_loss: 0.0629 - classification_loss: 0.0050 217/500 [============>.................] - ETA: 1:35 - loss: 0.0684 - regression_loss: 0.0634 - classification_loss: 0.0050 218/500 [============>.................] - ETA: 1:34 - loss: 0.0685 - regression_loss: 0.0635 - classification_loss: 0.0050 219/500 [============>.................] - ETA: 1:34 - loss: 0.0684 - regression_loss: 0.0633 - classification_loss: 0.0050 220/500 [============>.................] - ETA: 1:34 - loss: 0.0681 - regression_loss: 0.0631 - classification_loss: 0.0050 221/500 [============>.................] - ETA: 1:33 - loss: 0.0683 - regression_loss: 0.0633 - classification_loss: 0.0050 222/500 [============>.................] - ETA: 1:33 - loss: 0.0687 - regression_loss: 0.0636 - classification_loss: 0.0050 223/500 [============>.................] - ETA: 1:33 - loss: 0.0685 - regression_loss: 0.0635 - classification_loss: 0.0050 224/500 [============>.................] - ETA: 1:32 - loss: 0.0683 - regression_loss: 0.0633 - classification_loss: 0.0050 225/500 [============>.................] - ETA: 1:32 - loss: 0.0686 - regression_loss: 0.0635 - classification_loss: 0.0051 226/500 [============>.................] - ETA: 1:32 - loss: 0.0683 - regression_loss: 0.0633 - classification_loss: 0.0051 227/500 [============>.................] 
- ETA: 1:31 - loss: 0.0686 - regression_loss: 0.0635 - classification_loss: 0.0051 228/500 [============>.................] - ETA: 1:31 - loss: 0.0697 - regression_loss: 0.0646 - classification_loss: 0.0051 229/500 [============>.................] - ETA: 1:31 - loss: 0.0696 - regression_loss: 0.0645 - classification_loss: 0.0051 230/500 [============>.................] - ETA: 1:30 - loss: 0.0694 - regression_loss: 0.0644 - classification_loss: 0.0051 231/500 [============>.................] - ETA: 1:30 - loss: 0.0694 - regression_loss: 0.0643 - classification_loss: 0.0051 232/500 [============>.................] - ETA: 1:30 - loss: 0.0693 - regression_loss: 0.0642 - classification_loss: 0.0051 233/500 [============>.................] - ETA: 1:29 - loss: 0.0694 - regression_loss: 0.0643 - classification_loss: 0.0051 234/500 [=============>................] - ETA: 1:29 - loss: 0.0695 - regression_loss: 0.0644 - classification_loss: 0.0051 235/500 [=============>................] - ETA: 1:29 - loss: 0.0696 - regression_loss: 0.0644 - classification_loss: 0.0051 236/500 [=============>................] - ETA: 1:28 - loss: 0.0696 - regression_loss: 0.0645 - classification_loss: 0.0051 237/500 [=============>................] - ETA: 1:28 - loss: 0.0696 - regression_loss: 0.0644 - classification_loss: 0.0051 238/500 [=============>................] - ETA: 1:28 - loss: 0.0694 - regression_loss: 0.0642 - classification_loss: 0.0051 239/500 [=============>................] - ETA: 1:27 - loss: 0.0691 - regression_loss: 0.0640 - classification_loss: 0.0051 240/500 [=============>................] - ETA: 1:27 - loss: 0.0690 - regression_loss: 0.0639 - classification_loss: 0.0051 241/500 [=============>................] - ETA: 1:27 - loss: 0.0688 - regression_loss: 0.0637 - classification_loss: 0.0051 242/500 [=============>................] - ETA: 1:26 - loss: 0.0686 - regression_loss: 0.0635 - classification_loss: 0.0051 243/500 [=============>................] 
- ETA: 1:26 - loss: 0.0684 - regression_loss: 0.0634 - classification_loss: 0.0051 244/500 [=============>................] - ETA: 1:26 - loss: 0.0684 - regression_loss: 0.0634 - classification_loss: 0.0050 245/500 [=============>................] - ETA: 1:25 - loss: 0.0685 - regression_loss: 0.0634 - classification_loss: 0.0051 246/500 [=============>................] - ETA: 1:25 - loss: 0.0683 - regression_loss: 0.0633 - classification_loss: 0.0050 247/500 [=============>................] - ETA: 1:25 - loss: 0.0684 - regression_loss: 0.0633 - classification_loss: 0.0050 248/500 [=============>................] - ETA: 1:24 - loss: 0.0682 - regression_loss: 0.0632 - classification_loss: 0.0050 249/500 [=============>................] - ETA: 1:24 - loss: 0.0681 - regression_loss: 0.0631 - classification_loss: 0.0050 250/500 [==============>...............] - ETA: 1:24 - loss: 0.0683 - regression_loss: 0.0633 - classification_loss: 0.0050 251/500 [==============>...............] - ETA: 1:23 - loss: 0.0685 - regression_loss: 0.0635 - classification_loss: 0.0050 252/500 [==============>...............] - ETA: 1:23 - loss: 0.0685 - regression_loss: 0.0634 - classification_loss: 0.0050 253/500 [==============>...............] - ETA: 1:23 - loss: 0.0684 - regression_loss: 0.0634 - classification_loss: 0.0050 254/500 [==============>...............] - ETA: 1:22 - loss: 0.0682 - regression_loss: 0.0632 - classification_loss: 0.0050 255/500 [==============>...............] - ETA: 1:22 - loss: 0.0680 - regression_loss: 0.0630 - classification_loss: 0.0050 256/500 [==============>...............] - ETA: 1:22 - loss: 0.0681 - regression_loss: 0.0631 - classification_loss: 0.0050 257/500 [==============>...............] - ETA: 1:21 - loss: 0.0679 - regression_loss: 0.0629 - classification_loss: 0.0050 258/500 [==============>...............] - ETA: 1:21 - loss: 0.0677 - regression_loss: 0.0627 - classification_loss: 0.0050 259/500 [==============>...............] 
- ETA: 1:21 - loss: 0.0677 - regression_loss: 0.0628 - classification_loss: 0.0050 260/500 [==============>...............] - ETA: 1:20 - loss: 0.0680 - regression_loss: 0.0630 - classification_loss: 0.0050 261/500 [==============>...............] - ETA: 1:20 - loss: 0.0678 - regression_loss: 0.0628 - classification_loss: 0.0050 262/500 [==============>...............] - ETA: 1:20 - loss: 0.0677 - regression_loss: 0.0627 - classification_loss: 0.0050 263/500 [==============>...............] - ETA: 1:19 - loss: 0.0675 - regression_loss: 0.0625 - classification_loss: 0.0050 264/500 [==============>...............] - ETA: 1:19 - loss: 0.0672 - regression_loss: 0.0623 - classification_loss: 0.0050 265/500 [==============>...............] - ETA: 1:19 - loss: 0.0673 - regression_loss: 0.0624 - classification_loss: 0.0050 266/500 [==============>...............] - ETA: 1:18 - loss: 0.0677 - regression_loss: 0.0627 - classification_loss: 0.0050 267/500 [===============>..............] - ETA: 1:18 - loss: 0.0675 - regression_loss: 0.0626 - classification_loss: 0.0050 268/500 [===============>..............] - ETA: 1:18 - loss: 0.0677 - regression_loss: 0.0627 - classification_loss: 0.0050 269/500 [===============>..............] - ETA: 1:17 - loss: 0.0682 - regression_loss: 0.0631 - classification_loss: 0.0050 270/500 [===============>..............] - ETA: 1:17 - loss: 0.0681 - regression_loss: 0.0631 - classification_loss: 0.0050 271/500 [===============>..............] - ETA: 1:17 - loss: 0.0679 - regression_loss: 0.0629 - classification_loss: 0.0050 272/500 [===============>..............] - ETA: 1:16 - loss: 0.0680 - regression_loss: 0.0630 - classification_loss: 0.0050 273/500 [===============>..............] - ETA: 1:16 - loss: 0.0680 - regression_loss: 0.0630 - classification_loss: 0.0050 274/500 [===============>..............] - ETA: 1:16 - loss: 0.0678 - regression_loss: 0.0628 - classification_loss: 0.0050 275/500 [===============>..............] 
[per-step progress lines for epoch 60, steps 276-499, trimmed; loss held steady around 0.066-0.067]
500/500 [==============================] - 168s 337ms/step - loss: 0.0670 - regression_loss: 0.0619 - classification_loss: 0.0051
1172 instances of class plum with average precision: 0.7553
mAP: 0.7553
Epoch 00060: saving model to ./training/snapshots/resnet101_pascal_60.h5
Epoch 61/150
[per-step progress lines for epoch 61, steps 1-109, trimmed; loss settled around 0.063]
110/500 [=====>........................]
- ETA: 2:11 - loss: 0.0625 - regression_loss: 0.0581 - classification_loss: 0.0044 111/500 [=====>........................] - ETA: 2:10 - loss: 0.0624 - regression_loss: 0.0579 - classification_loss: 0.0044 112/500 [=====>........................] - ETA: 2:10 - loss: 0.0622 - regression_loss: 0.0578 - classification_loss: 0.0044 113/500 [=====>........................] - ETA: 2:10 - loss: 0.0626 - regression_loss: 0.0582 - classification_loss: 0.0045 114/500 [=====>........................] - ETA: 2:09 - loss: 0.0622 - regression_loss: 0.0577 - classification_loss: 0.0044 115/500 [=====>........................] - ETA: 2:09 - loss: 0.0621 - regression_loss: 0.0577 - classification_loss: 0.0045 116/500 [=====>........................] - ETA: 2:09 - loss: 0.0628 - regression_loss: 0.0581 - classification_loss: 0.0046 117/500 [======>.......................] - ETA: 2:08 - loss: 0.0625 - regression_loss: 0.0579 - classification_loss: 0.0046 118/500 [======>.......................] - ETA: 2:08 - loss: 0.0621 - regression_loss: 0.0575 - classification_loss: 0.0046 119/500 [======>.......................] - ETA: 2:08 - loss: 0.0621 - regression_loss: 0.0575 - classification_loss: 0.0046 120/500 [======>.......................] - ETA: 2:07 - loss: 0.0621 - regression_loss: 0.0575 - classification_loss: 0.0046 121/500 [======>.......................] - ETA: 2:07 - loss: 0.0619 - regression_loss: 0.0573 - classification_loss: 0.0046 122/500 [======>.......................] - ETA: 2:07 - loss: 0.0618 - regression_loss: 0.0572 - classification_loss: 0.0046 123/500 [======>.......................] - ETA: 2:06 - loss: 0.0616 - regression_loss: 0.0571 - classification_loss: 0.0046 124/500 [======>.......................] - ETA: 2:06 - loss: 0.0617 - regression_loss: 0.0571 - classification_loss: 0.0046 125/500 [======>.......................] - ETA: 2:06 - loss: 0.0615 - regression_loss: 0.0569 - classification_loss: 0.0046 126/500 [======>.......................] 
- ETA: 2:05 - loss: 0.0614 - regression_loss: 0.0568 - classification_loss: 0.0046 127/500 [======>.......................] - ETA: 2:05 - loss: 0.0612 - regression_loss: 0.0566 - classification_loss: 0.0046 128/500 [======>.......................] - ETA: 2:05 - loss: 0.0609 - regression_loss: 0.0563 - classification_loss: 0.0046 129/500 [======>.......................] - ETA: 2:04 - loss: 0.0605 - regression_loss: 0.0560 - classification_loss: 0.0045 130/500 [======>.......................] - ETA: 2:04 - loss: 0.0611 - regression_loss: 0.0566 - classification_loss: 0.0045 131/500 [======>.......................] - ETA: 2:04 - loss: 0.0608 - regression_loss: 0.0563 - classification_loss: 0.0045 132/500 [======>.......................] - ETA: 2:03 - loss: 0.0613 - regression_loss: 0.0568 - classification_loss: 0.0046 133/500 [======>.......................] - ETA: 2:03 - loss: 0.0622 - regression_loss: 0.0576 - classification_loss: 0.0046 134/500 [=======>......................] - ETA: 2:03 - loss: 0.0622 - regression_loss: 0.0575 - classification_loss: 0.0046 135/500 [=======>......................] - ETA: 2:02 - loss: 0.0619 - regression_loss: 0.0573 - classification_loss: 0.0046 136/500 [=======>......................] - ETA: 2:02 - loss: 0.0621 - regression_loss: 0.0575 - classification_loss: 0.0046 137/500 [=======>......................] - ETA: 2:02 - loss: 0.0623 - regression_loss: 0.0576 - classification_loss: 0.0046 138/500 [=======>......................] - ETA: 2:01 - loss: 0.0620 - regression_loss: 0.0574 - classification_loss: 0.0046 139/500 [=======>......................] - ETA: 2:01 - loss: 0.0618 - regression_loss: 0.0572 - classification_loss: 0.0046 140/500 [=======>......................] - ETA: 2:01 - loss: 0.0615 - regression_loss: 0.0570 - classification_loss: 0.0046 141/500 [=======>......................] - ETA: 2:00 - loss: 0.0613 - regression_loss: 0.0568 - classification_loss: 0.0045 142/500 [=======>......................] 
- ETA: 2:00 - loss: 0.0611 - regression_loss: 0.0565 - classification_loss: 0.0045 143/500 [=======>......................] - ETA: 2:00 - loss: 0.0614 - regression_loss: 0.0569 - classification_loss: 0.0046 144/500 [=======>......................] - ETA: 1:59 - loss: 0.0612 - regression_loss: 0.0566 - classification_loss: 0.0045 145/500 [=======>......................] - ETA: 1:59 - loss: 0.0609 - regression_loss: 0.0564 - classification_loss: 0.0045 146/500 [=======>......................] - ETA: 1:59 - loss: 0.0611 - regression_loss: 0.0565 - classification_loss: 0.0045 147/500 [=======>......................] - ETA: 1:58 - loss: 0.0611 - regression_loss: 0.0565 - classification_loss: 0.0046 148/500 [=======>......................] - ETA: 1:58 - loss: 0.0608 - regression_loss: 0.0562 - classification_loss: 0.0045 149/500 [=======>......................] - ETA: 1:58 - loss: 0.0611 - regression_loss: 0.0565 - classification_loss: 0.0045 150/500 [========>.....................] - ETA: 1:57 - loss: 0.0608 - regression_loss: 0.0563 - classification_loss: 0.0045 151/500 [========>.....................] - ETA: 1:57 - loss: 0.0606 - regression_loss: 0.0561 - classification_loss: 0.0045 152/500 [========>.....................] - ETA: 1:57 - loss: 0.0610 - regression_loss: 0.0564 - classification_loss: 0.0045 153/500 [========>.....................] - ETA: 1:56 - loss: 0.0609 - regression_loss: 0.0564 - classification_loss: 0.0045 154/500 [========>.....................] - ETA: 1:56 - loss: 0.0612 - regression_loss: 0.0567 - classification_loss: 0.0045 155/500 [========>.....................] - ETA: 1:56 - loss: 0.0611 - regression_loss: 0.0566 - classification_loss: 0.0045 156/500 [========>.....................] - ETA: 1:55 - loss: 0.0622 - regression_loss: 0.0575 - classification_loss: 0.0047 157/500 [========>.....................] - ETA: 1:55 - loss: 0.0619 - regression_loss: 0.0573 - classification_loss: 0.0046 158/500 [========>.....................] 
- ETA: 1:55 - loss: 0.0617 - regression_loss: 0.0571 - classification_loss: 0.0046 159/500 [========>.....................] - ETA: 1:54 - loss: 0.0616 - regression_loss: 0.0569 - classification_loss: 0.0046 160/500 [========>.....................] - ETA: 1:54 - loss: 0.0616 - regression_loss: 0.0570 - classification_loss: 0.0046 161/500 [========>.....................] - ETA: 1:54 - loss: 0.0616 - regression_loss: 0.0570 - classification_loss: 0.0046 162/500 [========>.....................] - ETA: 1:53 - loss: 0.0618 - regression_loss: 0.0571 - classification_loss: 0.0046 163/500 [========>.....................] - ETA: 1:53 - loss: 0.0621 - regression_loss: 0.0575 - classification_loss: 0.0046 164/500 [========>.....................] - ETA: 1:53 - loss: 0.0619 - regression_loss: 0.0572 - classification_loss: 0.0046 165/500 [========>.....................] - ETA: 1:52 - loss: 0.0620 - regression_loss: 0.0574 - classification_loss: 0.0046 166/500 [========>.....................] - ETA: 1:52 - loss: 0.0622 - regression_loss: 0.0576 - classification_loss: 0.0046 167/500 [=========>....................] - ETA: 1:52 - loss: 0.0623 - regression_loss: 0.0576 - classification_loss: 0.0046 168/500 [=========>....................] - ETA: 1:51 - loss: 0.0619 - regression_loss: 0.0573 - classification_loss: 0.0046 169/500 [=========>....................] - ETA: 1:51 - loss: 0.0623 - regression_loss: 0.0577 - classification_loss: 0.0046 170/500 [=========>....................] - ETA: 1:51 - loss: 0.0622 - regression_loss: 0.0576 - classification_loss: 0.0046 171/500 [=========>....................] - ETA: 1:50 - loss: 0.0620 - regression_loss: 0.0574 - classification_loss: 0.0046 172/500 [=========>....................] - ETA: 1:50 - loss: 0.0617 - regression_loss: 0.0572 - classification_loss: 0.0046 173/500 [=========>....................] - ETA: 1:50 - loss: 0.0615 - regression_loss: 0.0569 - classification_loss: 0.0046 174/500 [=========>....................] 
- ETA: 1:49 - loss: 0.0616 - regression_loss: 0.0570 - classification_loss: 0.0046 175/500 [=========>....................] - ETA: 1:49 - loss: 0.0616 - regression_loss: 0.0570 - classification_loss: 0.0046 176/500 [=========>....................] - ETA: 1:49 - loss: 0.0616 - regression_loss: 0.0570 - classification_loss: 0.0046 177/500 [=========>....................] - ETA: 1:48 - loss: 0.0615 - regression_loss: 0.0570 - classification_loss: 0.0046 178/500 [=========>....................] - ETA: 1:48 - loss: 0.0620 - regression_loss: 0.0574 - classification_loss: 0.0046 179/500 [=========>....................] - ETA: 1:48 - loss: 0.0625 - regression_loss: 0.0579 - classification_loss: 0.0046 180/500 [=========>....................] - ETA: 1:47 - loss: 0.0623 - regression_loss: 0.0577 - classification_loss: 0.0046 181/500 [=========>....................] - ETA: 1:47 - loss: 0.0621 - regression_loss: 0.0575 - classification_loss: 0.0046 182/500 [=========>....................] - ETA: 1:46 - loss: 0.0618 - regression_loss: 0.0572 - classification_loss: 0.0045 183/500 [=========>....................] - ETA: 1:46 - loss: 0.0624 - regression_loss: 0.0577 - classification_loss: 0.0047 184/500 [==========>...................] - ETA: 1:46 - loss: 0.0622 - regression_loss: 0.0576 - classification_loss: 0.0047 185/500 [==========>...................] - ETA: 1:45 - loss: 0.0623 - regression_loss: 0.0576 - classification_loss: 0.0047 186/500 [==========>...................] - ETA: 1:45 - loss: 0.0620 - regression_loss: 0.0573 - classification_loss: 0.0046 187/500 [==========>...................] - ETA: 1:45 - loss: 0.0619 - regression_loss: 0.0572 - classification_loss: 0.0047 188/500 [==========>...................] - ETA: 1:44 - loss: 0.0624 - regression_loss: 0.0577 - classification_loss: 0.0047 189/500 [==========>...................] - ETA: 1:44 - loss: 0.0622 - regression_loss: 0.0576 - classification_loss: 0.0047 190/500 [==========>...................] 
- ETA: 1:44 - loss: 0.0624 - regression_loss: 0.0577 - classification_loss: 0.0047 191/500 [==========>...................] - ETA: 1:43 - loss: 0.0622 - regression_loss: 0.0576 - classification_loss: 0.0046 192/500 [==========>...................] - ETA: 1:43 - loss: 0.0629 - regression_loss: 0.0582 - classification_loss: 0.0047 193/500 [==========>...................] - ETA: 1:43 - loss: 0.0629 - regression_loss: 0.0582 - classification_loss: 0.0047 194/500 [==========>...................] - ETA: 1:42 - loss: 0.0630 - regression_loss: 0.0582 - classification_loss: 0.0047 195/500 [==========>...................] - ETA: 1:42 - loss: 0.0628 - regression_loss: 0.0581 - classification_loss: 0.0047 196/500 [==========>...................] - ETA: 1:42 - loss: 0.0628 - regression_loss: 0.0581 - classification_loss: 0.0047 197/500 [==========>...................] - ETA: 1:41 - loss: 0.0635 - regression_loss: 0.0588 - classification_loss: 0.0047 198/500 [==========>...................] - ETA: 1:41 - loss: 0.0636 - regression_loss: 0.0590 - classification_loss: 0.0047 199/500 [==========>...................] - ETA: 1:41 - loss: 0.0635 - regression_loss: 0.0588 - classification_loss: 0.0047 200/500 [===========>..................] - ETA: 1:40 - loss: 0.0646 - regression_loss: 0.0598 - classification_loss: 0.0047 201/500 [===========>..................] - ETA: 1:40 - loss: 0.0646 - regression_loss: 0.0599 - classification_loss: 0.0047 202/500 [===========>..................] - ETA: 1:40 - loss: 0.0644 - regression_loss: 0.0597 - classification_loss: 0.0047 203/500 [===========>..................] - ETA: 1:39 - loss: 0.0642 - regression_loss: 0.0595 - classification_loss: 0.0047 204/500 [===========>..................] - ETA: 1:39 - loss: 0.0640 - regression_loss: 0.0593 - classification_loss: 0.0047 205/500 [===========>..................] - ETA: 1:39 - loss: 0.0638 - regression_loss: 0.0591 - classification_loss: 0.0047 206/500 [===========>..................] 
- ETA: 1:38 - loss: 0.0637 - regression_loss: 0.0591 - classification_loss: 0.0047 207/500 [===========>..................] - ETA: 1:38 - loss: 0.0635 - regression_loss: 0.0589 - classification_loss: 0.0046 208/500 [===========>..................] - ETA: 1:38 - loss: 0.0633 - regression_loss: 0.0586 - classification_loss: 0.0046 209/500 [===========>..................] - ETA: 1:37 - loss: 0.0634 - regression_loss: 0.0588 - classification_loss: 0.0046 210/500 [===========>..................] - ETA: 1:37 - loss: 0.0636 - regression_loss: 0.0590 - classification_loss: 0.0046 211/500 [===========>..................] - ETA: 1:37 - loss: 0.0634 - regression_loss: 0.0588 - classification_loss: 0.0046 212/500 [===========>..................] - ETA: 1:36 - loss: 0.0635 - regression_loss: 0.0589 - classification_loss: 0.0046 213/500 [===========>..................] - ETA: 1:36 - loss: 0.0644 - regression_loss: 0.0595 - classification_loss: 0.0049 214/500 [===========>..................] - ETA: 1:36 - loss: 0.0649 - regression_loss: 0.0600 - classification_loss: 0.0049 215/500 [===========>..................] - ETA: 1:35 - loss: 0.0647 - regression_loss: 0.0598 - classification_loss: 0.0049 216/500 [===========>..................] - ETA: 1:35 - loss: 0.0650 - regression_loss: 0.0601 - classification_loss: 0.0049 217/500 [============>.................] - ETA: 1:35 - loss: 0.0648 - regression_loss: 0.0599 - classification_loss: 0.0049 218/500 [============>.................] - ETA: 1:34 - loss: 0.0651 - regression_loss: 0.0602 - classification_loss: 0.0050 219/500 [============>.................] - ETA: 1:34 - loss: 0.0653 - regression_loss: 0.0603 - classification_loss: 0.0050 220/500 [============>.................] - ETA: 1:34 - loss: 0.0659 - regression_loss: 0.0609 - classification_loss: 0.0050 221/500 [============>.................] - ETA: 1:33 - loss: 0.0657 - regression_loss: 0.0607 - classification_loss: 0.0050 222/500 [============>.................] 
- ETA: 1:33 - loss: 0.0659 - regression_loss: 0.0609 - classification_loss: 0.0051 223/500 [============>.................] - ETA: 1:33 - loss: 0.0658 - regression_loss: 0.0607 - classification_loss: 0.0050 224/500 [============>.................] - ETA: 1:32 - loss: 0.0656 - regression_loss: 0.0606 - classification_loss: 0.0050 225/500 [============>.................] - ETA: 1:32 - loss: 0.0656 - regression_loss: 0.0606 - classification_loss: 0.0050 226/500 [============>.................] - ETA: 1:32 - loss: 0.0653 - regression_loss: 0.0603 - classification_loss: 0.0050 227/500 [============>.................] - ETA: 1:31 - loss: 0.0652 - regression_loss: 0.0602 - classification_loss: 0.0050 228/500 [============>.................] - ETA: 1:31 - loss: 0.0650 - regression_loss: 0.0600 - classification_loss: 0.0050 229/500 [============>.................] - ETA: 1:31 - loss: 0.0651 - regression_loss: 0.0601 - classification_loss: 0.0050 230/500 [============>.................] - ETA: 1:30 - loss: 0.0650 - regression_loss: 0.0600 - classification_loss: 0.0050 231/500 [============>.................] - ETA: 1:30 - loss: 0.0648 - regression_loss: 0.0598 - classification_loss: 0.0050 232/500 [============>.................] - ETA: 1:30 - loss: 0.0649 - regression_loss: 0.0599 - classification_loss: 0.0050 233/500 [============>.................] - ETA: 1:29 - loss: 0.0646 - regression_loss: 0.0597 - classification_loss: 0.0050 234/500 [=============>................] - ETA: 1:29 - loss: 0.0646 - regression_loss: 0.0596 - classification_loss: 0.0050 235/500 [=============>................] - ETA: 1:29 - loss: 0.0646 - regression_loss: 0.0597 - classification_loss: 0.0049 236/500 [=============>................] - ETA: 1:28 - loss: 0.0644 - regression_loss: 0.0595 - classification_loss: 0.0049 237/500 [=============>................] - ETA: 1:28 - loss: 0.0643 - regression_loss: 0.0594 - classification_loss: 0.0049 238/500 [=============>................] 
- ETA: 1:28 - loss: 0.0643 - regression_loss: 0.0594 - classification_loss: 0.0049 239/500 [=============>................] - ETA: 1:27 - loss: 0.0642 - regression_loss: 0.0593 - classification_loss: 0.0049 240/500 [=============>................] - ETA: 1:27 - loss: 0.0641 - regression_loss: 0.0592 - classification_loss: 0.0049 241/500 [=============>................] - ETA: 1:27 - loss: 0.0643 - regression_loss: 0.0594 - classification_loss: 0.0049 242/500 [=============>................] - ETA: 1:26 - loss: 0.0641 - regression_loss: 0.0593 - classification_loss: 0.0049 243/500 [=============>................] - ETA: 1:26 - loss: 0.0640 - regression_loss: 0.0591 - classification_loss: 0.0049 244/500 [=============>................] - ETA: 1:26 - loss: 0.0638 - regression_loss: 0.0590 - classification_loss: 0.0048 245/500 [=============>................] - ETA: 1:25 - loss: 0.0638 - regression_loss: 0.0590 - classification_loss: 0.0048 246/500 [=============>................] - ETA: 1:25 - loss: 0.0637 - regression_loss: 0.0589 - classification_loss: 0.0048 247/500 [=============>................] - ETA: 1:25 - loss: 0.0637 - regression_loss: 0.0589 - classification_loss: 0.0048 248/500 [=============>................] - ETA: 1:24 - loss: 0.0635 - regression_loss: 0.0587 - classification_loss: 0.0048 249/500 [=============>................] - ETA: 1:24 - loss: 0.0637 - regression_loss: 0.0589 - classification_loss: 0.0048 250/500 [==============>...............] - ETA: 1:24 - loss: 0.0636 - regression_loss: 0.0588 - classification_loss: 0.0048 251/500 [==============>...............] - ETA: 1:23 - loss: 0.0636 - regression_loss: 0.0588 - classification_loss: 0.0048 252/500 [==============>...............] - ETA: 1:23 - loss: 0.0635 - regression_loss: 0.0587 - classification_loss: 0.0048 253/500 [==============>...............] - ETA: 1:23 - loss: 0.0633 - regression_loss: 0.0585 - classification_loss: 0.0048 254/500 [==============>...............] 
- ETA: 1:22 - loss: 0.0631 - regression_loss: 0.0583 - classification_loss: 0.0048 255/500 [==============>...............] - ETA: 1:22 - loss: 0.0629 - regression_loss: 0.0582 - classification_loss: 0.0048 256/500 [==============>...............] - ETA: 1:22 - loss: 0.0629 - regression_loss: 0.0581 - classification_loss: 0.0047 257/500 [==============>...............] - ETA: 1:21 - loss: 0.0628 - regression_loss: 0.0581 - classification_loss: 0.0047 258/500 [==============>...............] - ETA: 1:21 - loss: 0.0629 - regression_loss: 0.0581 - classification_loss: 0.0048 259/500 [==============>...............] - ETA: 1:21 - loss: 0.0630 - regression_loss: 0.0582 - classification_loss: 0.0048 260/500 [==============>...............] - ETA: 1:20 - loss: 0.0630 - regression_loss: 0.0582 - classification_loss: 0.0048 261/500 [==============>...............] - ETA: 1:20 - loss: 0.0639 - regression_loss: 0.0592 - classification_loss: 0.0048 262/500 [==============>...............] - ETA: 1:20 - loss: 0.0639 - regression_loss: 0.0591 - classification_loss: 0.0048 263/500 [==============>...............] - ETA: 1:19 - loss: 0.0637 - regression_loss: 0.0590 - classification_loss: 0.0047 264/500 [==============>...............] - ETA: 1:19 - loss: 0.0637 - regression_loss: 0.0590 - classification_loss: 0.0048 265/500 [==============>...............] - ETA: 1:19 - loss: 0.0637 - regression_loss: 0.0589 - classification_loss: 0.0047 266/500 [==============>...............] - ETA: 1:18 - loss: 0.0640 - regression_loss: 0.0592 - classification_loss: 0.0048 267/500 [===============>..............] - ETA: 1:18 - loss: 0.0639 - regression_loss: 0.0591 - classification_loss: 0.0048 268/500 [===============>..............] - ETA: 1:18 - loss: 0.0637 - regression_loss: 0.0589 - classification_loss: 0.0048 269/500 [===============>..............] - ETA: 1:17 - loss: 0.0636 - regression_loss: 0.0588 - classification_loss: 0.0047 270/500 [===============>..............] 
- ETA: 1:17 - loss: 0.0634 - regression_loss: 0.0587 - classification_loss: 0.0047 271/500 [===============>..............] - ETA: 1:17 - loss: 0.0634 - regression_loss: 0.0586 - classification_loss: 0.0047 272/500 [===============>..............] - ETA: 1:16 - loss: 0.0632 - regression_loss: 0.0584 - classification_loss: 0.0047 273/500 [===============>..............] - ETA: 1:16 - loss: 0.0631 - regression_loss: 0.0584 - classification_loss: 0.0047 274/500 [===============>..............] - ETA: 1:16 - loss: 0.0629 - regression_loss: 0.0583 - classification_loss: 0.0047 275/500 [===============>..............] - ETA: 1:15 - loss: 0.0632 - regression_loss: 0.0585 - classification_loss: 0.0047 276/500 [===============>..............] - ETA: 1:15 - loss: 0.0630 - regression_loss: 0.0583 - classification_loss: 0.0047 277/500 [===============>..............] - ETA: 1:15 - loss: 0.0628 - regression_loss: 0.0581 - classification_loss: 0.0047 278/500 [===============>..............] - ETA: 1:14 - loss: 0.0626 - regression_loss: 0.0580 - classification_loss: 0.0047 279/500 [===============>..............] - ETA: 1:14 - loss: 0.0625 - regression_loss: 0.0578 - classification_loss: 0.0047 280/500 [===============>..............] - ETA: 1:14 - loss: 0.0624 - regression_loss: 0.0578 - classification_loss: 0.0047 281/500 [===============>..............] - ETA: 1:13 - loss: 0.0628 - regression_loss: 0.0581 - classification_loss: 0.0047 282/500 [===============>..............] - ETA: 1:13 - loss: 0.0630 - regression_loss: 0.0583 - classification_loss: 0.0047 283/500 [===============>..............] - ETA: 1:13 - loss: 0.0630 - regression_loss: 0.0582 - classification_loss: 0.0047 284/500 [================>.............] - ETA: 1:12 - loss: 0.0630 - regression_loss: 0.0583 - classification_loss: 0.0047 285/500 [================>.............] - ETA: 1:12 - loss: 0.0628 - regression_loss: 0.0581 - classification_loss: 0.0047 286/500 [================>.............] 
- ETA: 1:12 - loss: 0.0627 - regression_loss: 0.0580 - classification_loss: 0.0047 287/500 [================>.............] - ETA: 1:11 - loss: 0.0625 - regression_loss: 0.0578 - classification_loss: 0.0047 288/500 [================>.............] - ETA: 1:11 - loss: 0.0625 - regression_loss: 0.0578 - classification_loss: 0.0047 289/500 [================>.............] - ETA: 1:11 - loss: 0.0631 - regression_loss: 0.0583 - classification_loss: 0.0048 290/500 [================>.............] - ETA: 1:10 - loss: 0.0629 - regression_loss: 0.0581 - classification_loss: 0.0048 291/500 [================>.............] - ETA: 1:10 - loss: 0.0630 - regression_loss: 0.0582 - classification_loss: 0.0048 292/500 [================>.............] - ETA: 1:10 - loss: 0.0632 - regression_loss: 0.0585 - classification_loss: 0.0048 293/500 [================>.............] - ETA: 1:09 - loss: 0.0630 - regression_loss: 0.0583 - classification_loss: 0.0047 294/500 [================>.............] - ETA: 1:09 - loss: 0.0633 - regression_loss: 0.0586 - classification_loss: 0.0048 295/500 [================>.............] - ETA: 1:09 - loss: 0.0633 - regression_loss: 0.0585 - classification_loss: 0.0048 296/500 [================>.............] - ETA: 1:08 - loss: 0.0634 - regression_loss: 0.0586 - classification_loss: 0.0048 297/500 [================>.............] - ETA: 1:08 - loss: 0.0635 - regression_loss: 0.0587 - classification_loss: 0.0048 298/500 [================>.............] - ETA: 1:08 - loss: 0.0636 - regression_loss: 0.0588 - classification_loss: 0.0048 299/500 [================>.............] - ETA: 1:07 - loss: 0.0634 - regression_loss: 0.0586 - classification_loss: 0.0048 300/500 [=================>............] - ETA: 1:07 - loss: 0.0636 - regression_loss: 0.0589 - classification_loss: 0.0048 301/500 [=================>............] - ETA: 1:07 - loss: 0.0635 - regression_loss: 0.0588 - classification_loss: 0.0048 302/500 [=================>............] 
- ETA: 1:06 - loss: 0.0635 - regression_loss: 0.0587 - classification_loss: 0.0048 303/500 [=================>............] - ETA: 1:06 - loss: 0.0633 - regression_loss: 0.0586 - classification_loss: 0.0048 304/500 [=================>............] - ETA: 1:06 - loss: 0.0633 - regression_loss: 0.0586 - classification_loss: 0.0048 305/500 [=================>............] - ETA: 1:05 - loss: 0.0632 - regression_loss: 0.0584 - classification_loss: 0.0048 306/500 [=================>............] - ETA: 1:05 - loss: 0.0632 - regression_loss: 0.0584 - classification_loss: 0.0048 307/500 [=================>............] - ETA: 1:05 - loss: 0.0630 - regression_loss: 0.0582 - classification_loss: 0.0048 308/500 [=================>............] - ETA: 1:04 - loss: 0.0629 - regression_loss: 0.0581 - classification_loss: 0.0047 309/500 [=================>............] - ETA: 1:04 - loss: 0.0633 - regression_loss: 0.0585 - classification_loss: 0.0048 310/500 [=================>............] - ETA: 1:03 - loss: 0.0631 - regression_loss: 0.0584 - classification_loss: 0.0047 311/500 [=================>............] - ETA: 1:03 - loss: 0.0630 - regression_loss: 0.0583 - classification_loss: 0.0047 312/500 [=================>............] - ETA: 1:03 - loss: 0.0629 - regression_loss: 0.0582 - classification_loss: 0.0047 313/500 [=================>............] - ETA: 1:02 - loss: 0.0630 - regression_loss: 0.0583 - classification_loss: 0.0047 314/500 [=================>............] - ETA: 1:02 - loss: 0.0630 - regression_loss: 0.0582 - classification_loss: 0.0047 315/500 [=================>............] - ETA: 1:02 - loss: 0.0631 - regression_loss: 0.0583 - classification_loss: 0.0047 316/500 [=================>............] - ETA: 1:01 - loss: 0.0630 - regression_loss: 0.0582 - classification_loss: 0.0047 317/500 [==================>...........] - ETA: 1:01 - loss: 0.0629 - regression_loss: 0.0582 - classification_loss: 0.0047 318/500 [==================>...........] 
[Epoch 61/150: per-batch progress lines for batches 319-500 truncated; loss held steady around 0.062-0.063, classification_loss around 0.0046-0.0048]
500/500 [==============================] - 168s 337ms/step - loss: 0.0621 - regression_loss: 0.0575 - classification_loss: 0.0046
1172 instances of class plum with average precision: 0.7543
mAP: 0.7543
Epoch 00061: saving model to ./training/snapshots/resnet101_pascal_61.h5
Epoch 62/150
[Epoch 62/150: per-batch progress lines for batches 1-153 truncated; loss settled around 0.064-0.067 after a noisy first few batches]
- ETA: 1:57 - loss: 0.0639 - regression_loss: 0.0588 - classification_loss: 0.0051 154/500 [========>.....................] - ETA: 1:57 - loss: 0.0636 - regression_loss: 0.0586 - classification_loss: 0.0050 155/500 [========>.....................] - ETA: 1:56 - loss: 0.0642 - regression_loss: 0.0592 - classification_loss: 0.0050 156/500 [========>.....................] - ETA: 1:56 - loss: 0.0640 - regression_loss: 0.0590 - classification_loss: 0.0050 157/500 [========>.....................] - ETA: 1:55 - loss: 0.0639 - regression_loss: 0.0589 - classification_loss: 0.0050 158/500 [========>.....................] - ETA: 1:55 - loss: 0.0640 - regression_loss: 0.0589 - classification_loss: 0.0050 159/500 [========>.....................] - ETA: 1:55 - loss: 0.0639 - regression_loss: 0.0588 - classification_loss: 0.0050 160/500 [========>.....................] - ETA: 1:54 - loss: 0.0637 - regression_loss: 0.0587 - classification_loss: 0.0050 161/500 [========>.....................] - ETA: 1:54 - loss: 0.0634 - regression_loss: 0.0584 - classification_loss: 0.0050 162/500 [========>.....................] - ETA: 1:54 - loss: 0.0632 - regression_loss: 0.0582 - classification_loss: 0.0050 163/500 [========>.....................] - ETA: 1:53 - loss: 0.0635 - regression_loss: 0.0585 - classification_loss: 0.0050 164/500 [========>.....................] - ETA: 1:53 - loss: 0.0636 - regression_loss: 0.0586 - classification_loss: 0.0050 165/500 [========>.....................] - ETA: 1:53 - loss: 0.0635 - regression_loss: 0.0585 - classification_loss: 0.0050 166/500 [========>.....................] - ETA: 1:52 - loss: 0.0633 - regression_loss: 0.0583 - classification_loss: 0.0050 167/500 [=========>....................] - ETA: 1:52 - loss: 0.0630 - regression_loss: 0.0580 - classification_loss: 0.0050 168/500 [=========>....................] - ETA: 1:52 - loss: 0.0629 - regression_loss: 0.0579 - classification_loss: 0.0049 169/500 [=========>....................] 
- ETA: 1:51 - loss: 0.0628 - regression_loss: 0.0578 - classification_loss: 0.0049 170/500 [=========>....................] - ETA: 1:51 - loss: 0.0629 - regression_loss: 0.0579 - classification_loss: 0.0049 171/500 [=========>....................] - ETA: 1:51 - loss: 0.0633 - regression_loss: 0.0583 - classification_loss: 0.0049 172/500 [=========>....................] - ETA: 1:50 - loss: 0.0630 - regression_loss: 0.0581 - classification_loss: 0.0049 173/500 [=========>....................] - ETA: 1:50 - loss: 0.0628 - regression_loss: 0.0579 - classification_loss: 0.0049 174/500 [=========>....................] - ETA: 1:50 - loss: 0.0627 - regression_loss: 0.0578 - classification_loss: 0.0049 175/500 [=========>....................] - ETA: 1:49 - loss: 0.0629 - regression_loss: 0.0580 - classification_loss: 0.0049 176/500 [=========>....................] - ETA: 1:49 - loss: 0.0627 - regression_loss: 0.0579 - classification_loss: 0.0049 177/500 [=========>....................] - ETA: 1:49 - loss: 0.0626 - regression_loss: 0.0578 - classification_loss: 0.0049 178/500 [=========>....................] - ETA: 1:48 - loss: 0.0630 - regression_loss: 0.0581 - classification_loss: 0.0049 179/500 [=========>....................] - ETA: 1:48 - loss: 0.0630 - regression_loss: 0.0581 - classification_loss: 0.0049 180/500 [=========>....................] - ETA: 1:48 - loss: 0.0631 - regression_loss: 0.0582 - classification_loss: 0.0049 181/500 [=========>....................] - ETA: 1:47 - loss: 0.0628 - regression_loss: 0.0580 - classification_loss: 0.0048 182/500 [=========>....................] - ETA: 1:47 - loss: 0.0627 - regression_loss: 0.0579 - classification_loss: 0.0048 183/500 [=========>....................] - ETA: 1:47 - loss: 0.0627 - regression_loss: 0.0579 - classification_loss: 0.0048 184/500 [==========>...................] - ETA: 1:46 - loss: 0.0624 - regression_loss: 0.0576 - classification_loss: 0.0048 185/500 [==========>...................] 
- ETA: 1:46 - loss: 0.0625 - regression_loss: 0.0578 - classification_loss: 0.0048 186/500 [==========>...................] - ETA: 1:46 - loss: 0.0623 - regression_loss: 0.0575 - classification_loss: 0.0048 187/500 [==========>...................] - ETA: 1:45 - loss: 0.0621 - regression_loss: 0.0573 - classification_loss: 0.0048 188/500 [==========>...................] - ETA: 1:45 - loss: 0.0623 - regression_loss: 0.0576 - classification_loss: 0.0048 189/500 [==========>...................] - ETA: 1:45 - loss: 0.0621 - regression_loss: 0.0573 - classification_loss: 0.0047 190/500 [==========>...................] - ETA: 1:44 - loss: 0.0618 - regression_loss: 0.0571 - classification_loss: 0.0047 191/500 [==========>...................] - ETA: 1:44 - loss: 0.0622 - regression_loss: 0.0574 - classification_loss: 0.0048 192/500 [==========>...................] - ETA: 1:44 - loss: 0.0620 - regression_loss: 0.0572 - classification_loss: 0.0048 193/500 [==========>...................] - ETA: 1:43 - loss: 0.0623 - regression_loss: 0.0575 - classification_loss: 0.0048 194/500 [==========>...................] - ETA: 1:43 - loss: 0.0621 - regression_loss: 0.0573 - classification_loss: 0.0048 195/500 [==========>...................] - ETA: 1:43 - loss: 0.0634 - regression_loss: 0.0586 - classification_loss: 0.0048 196/500 [==========>...................] - ETA: 1:42 - loss: 0.0632 - regression_loss: 0.0584 - classification_loss: 0.0048 197/500 [==========>...................] - ETA: 1:42 - loss: 0.0634 - regression_loss: 0.0586 - classification_loss: 0.0048 198/500 [==========>...................] - ETA: 1:42 - loss: 0.0639 - regression_loss: 0.0591 - classification_loss: 0.0048 199/500 [==========>...................] - ETA: 1:41 - loss: 0.0640 - regression_loss: 0.0592 - classification_loss: 0.0048 200/500 [===========>..................] - ETA: 1:41 - loss: 0.0637 - regression_loss: 0.0589 - classification_loss: 0.0048 201/500 [===========>..................] 
- ETA: 1:41 - loss: 0.0640 - regression_loss: 0.0592 - classification_loss: 0.0048 202/500 [===========>..................] - ETA: 1:40 - loss: 0.0639 - regression_loss: 0.0591 - classification_loss: 0.0048 203/500 [===========>..................] - ETA: 1:40 - loss: 0.0637 - regression_loss: 0.0589 - classification_loss: 0.0048 204/500 [===========>..................] - ETA: 1:40 - loss: 0.0636 - regression_loss: 0.0589 - classification_loss: 0.0048 205/500 [===========>..................] - ETA: 1:39 - loss: 0.0636 - regression_loss: 0.0589 - classification_loss: 0.0048 206/500 [===========>..................] - ETA: 1:39 - loss: 0.0634 - regression_loss: 0.0587 - classification_loss: 0.0047 207/500 [===========>..................] - ETA: 1:39 - loss: 0.0633 - regression_loss: 0.0585 - classification_loss: 0.0047 208/500 [===========>..................] - ETA: 1:38 - loss: 0.0636 - regression_loss: 0.0588 - classification_loss: 0.0047 209/500 [===========>..................] - ETA: 1:38 - loss: 0.0634 - regression_loss: 0.0586 - classification_loss: 0.0047 210/500 [===========>..................] - ETA: 1:38 - loss: 0.0637 - regression_loss: 0.0589 - classification_loss: 0.0048 211/500 [===========>..................] - ETA: 1:37 - loss: 0.0636 - regression_loss: 0.0588 - classification_loss: 0.0048 212/500 [===========>..................] - ETA: 1:37 - loss: 0.0636 - regression_loss: 0.0588 - classification_loss: 0.0048 213/500 [===========>..................] - ETA: 1:37 - loss: 0.0634 - regression_loss: 0.0586 - classification_loss: 0.0048 214/500 [===========>..................] - ETA: 1:36 - loss: 0.0632 - regression_loss: 0.0584 - classification_loss: 0.0048 215/500 [===========>..................] - ETA: 1:36 - loss: 0.0633 - regression_loss: 0.0586 - classification_loss: 0.0048 216/500 [===========>..................] - ETA: 1:36 - loss: 0.0635 - regression_loss: 0.0587 - classification_loss: 0.0048 217/500 [============>.................] 
- ETA: 1:35 - loss: 0.0634 - regression_loss: 0.0586 - classification_loss: 0.0048 218/500 [============>.................] - ETA: 1:35 - loss: 0.0634 - regression_loss: 0.0586 - classification_loss: 0.0048 219/500 [============>.................] - ETA: 1:35 - loss: 0.0634 - regression_loss: 0.0586 - classification_loss: 0.0047 220/500 [============>.................] - ETA: 1:34 - loss: 0.0634 - regression_loss: 0.0587 - classification_loss: 0.0048 221/500 [============>.................] - ETA: 1:34 - loss: 0.0634 - regression_loss: 0.0586 - classification_loss: 0.0048 222/500 [============>.................] - ETA: 1:34 - loss: 0.0639 - regression_loss: 0.0592 - classification_loss: 0.0048 223/500 [============>.................] - ETA: 1:33 - loss: 0.0638 - regression_loss: 0.0590 - classification_loss: 0.0047 224/500 [============>.................] - ETA: 1:33 - loss: 0.0636 - regression_loss: 0.0588 - classification_loss: 0.0047 225/500 [============>.................] - ETA: 1:32 - loss: 0.0639 - regression_loss: 0.0592 - classification_loss: 0.0047 226/500 [============>.................] - ETA: 1:32 - loss: 0.0644 - regression_loss: 0.0596 - classification_loss: 0.0047 227/500 [============>.................] - ETA: 1:32 - loss: 0.0646 - regression_loss: 0.0598 - classification_loss: 0.0047 228/500 [============>.................] - ETA: 1:31 - loss: 0.0650 - regression_loss: 0.0603 - classification_loss: 0.0048 229/500 [============>.................] - ETA: 1:31 - loss: 0.0651 - regression_loss: 0.0604 - classification_loss: 0.0048 230/500 [============>.................] - ETA: 1:31 - loss: 0.0653 - regression_loss: 0.0605 - classification_loss: 0.0048 231/500 [============>.................] - ETA: 1:30 - loss: 0.0651 - regression_loss: 0.0603 - classification_loss: 0.0048 232/500 [============>.................] - ETA: 1:30 - loss: 0.0652 - regression_loss: 0.0604 - classification_loss: 0.0048 233/500 [============>.................] 
- ETA: 1:30 - loss: 0.0650 - regression_loss: 0.0602 - classification_loss: 0.0048 234/500 [=============>................] - ETA: 1:29 - loss: 0.0648 - regression_loss: 0.0601 - classification_loss: 0.0048 235/500 [=============>................] - ETA: 1:29 - loss: 0.0647 - regression_loss: 0.0600 - classification_loss: 0.0047 236/500 [=============>................] - ETA: 1:29 - loss: 0.0646 - regression_loss: 0.0599 - classification_loss: 0.0047 237/500 [=============>................] - ETA: 1:28 - loss: 0.0649 - regression_loss: 0.0601 - classification_loss: 0.0048 238/500 [=============>................] - ETA: 1:28 - loss: 0.0647 - regression_loss: 0.0599 - classification_loss: 0.0048 239/500 [=============>................] - ETA: 1:28 - loss: 0.0645 - regression_loss: 0.0597 - classification_loss: 0.0048 240/500 [=============>................] - ETA: 1:27 - loss: 0.0649 - regression_loss: 0.0600 - classification_loss: 0.0048 241/500 [=============>................] - ETA: 1:27 - loss: 0.0652 - regression_loss: 0.0604 - classification_loss: 0.0049 242/500 [=============>................] - ETA: 1:27 - loss: 0.0650 - regression_loss: 0.0602 - classification_loss: 0.0048 243/500 [=============>................] - ETA: 1:26 - loss: 0.0657 - regression_loss: 0.0606 - classification_loss: 0.0051 244/500 [=============>................] - ETA: 1:26 - loss: 0.0657 - regression_loss: 0.0605 - classification_loss: 0.0051 245/500 [=============>................] - ETA: 1:26 - loss: 0.0655 - regression_loss: 0.0605 - classification_loss: 0.0051 246/500 [=============>................] - ETA: 1:25 - loss: 0.0654 - regression_loss: 0.0603 - classification_loss: 0.0051 247/500 [=============>................] - ETA: 1:25 - loss: 0.0655 - regression_loss: 0.0604 - classification_loss: 0.0051 248/500 [=============>................] - ETA: 1:25 - loss: 0.0653 - regression_loss: 0.0602 - classification_loss: 0.0051 249/500 [=============>................] 
- ETA: 1:24 - loss: 0.0652 - regression_loss: 0.0601 - classification_loss: 0.0051 250/500 [==============>...............] - ETA: 1:24 - loss: 0.0652 - regression_loss: 0.0601 - classification_loss: 0.0051 251/500 [==============>...............] - ETA: 1:24 - loss: 0.0654 - regression_loss: 0.0603 - classification_loss: 0.0051 252/500 [==============>...............] - ETA: 1:23 - loss: 0.0652 - regression_loss: 0.0601 - classification_loss: 0.0051 253/500 [==============>...............] - ETA: 1:23 - loss: 0.0650 - regression_loss: 0.0599 - classification_loss: 0.0051 254/500 [==============>...............] - ETA: 1:23 - loss: 0.0650 - regression_loss: 0.0599 - classification_loss: 0.0051 255/500 [==============>...............] - ETA: 1:22 - loss: 0.0650 - regression_loss: 0.0599 - classification_loss: 0.0051 256/500 [==============>...............] - ETA: 1:22 - loss: 0.0655 - regression_loss: 0.0604 - classification_loss: 0.0051 257/500 [==============>...............] - ETA: 1:22 - loss: 0.0657 - regression_loss: 0.0606 - classification_loss: 0.0052 258/500 [==============>...............] - ETA: 1:21 - loss: 0.0660 - regression_loss: 0.0608 - classification_loss: 0.0052 259/500 [==============>...............] - ETA: 1:21 - loss: 0.0659 - regression_loss: 0.0607 - classification_loss: 0.0052 260/500 [==============>...............] - ETA: 1:21 - loss: 0.0658 - regression_loss: 0.0606 - classification_loss: 0.0052 261/500 [==============>...............] - ETA: 1:20 - loss: 0.0658 - regression_loss: 0.0606 - classification_loss: 0.0052 262/500 [==============>...............] - ETA: 1:20 - loss: 0.0659 - regression_loss: 0.0607 - classification_loss: 0.0052 263/500 [==============>...............] - ETA: 1:20 - loss: 0.0658 - regression_loss: 0.0606 - classification_loss: 0.0052 264/500 [==============>...............] - ETA: 1:19 - loss: 0.0656 - regression_loss: 0.0605 - classification_loss: 0.0052 265/500 [==============>...............] 
- ETA: 1:19 - loss: 0.0655 - regression_loss: 0.0603 - classification_loss: 0.0052 266/500 [==============>...............] - ETA: 1:19 - loss: 0.0659 - regression_loss: 0.0607 - classification_loss: 0.0052 267/500 [===============>..............] - ETA: 1:18 - loss: 0.0659 - regression_loss: 0.0607 - classification_loss: 0.0052 268/500 [===============>..............] - ETA: 1:18 - loss: 0.0659 - regression_loss: 0.0607 - classification_loss: 0.0052 269/500 [===============>..............] - ETA: 1:18 - loss: 0.0657 - regression_loss: 0.0605 - classification_loss: 0.0052 270/500 [===============>..............] - ETA: 1:17 - loss: 0.0658 - regression_loss: 0.0606 - classification_loss: 0.0052 271/500 [===============>..............] - ETA: 1:17 - loss: 0.0660 - regression_loss: 0.0608 - classification_loss: 0.0052 272/500 [===============>..............] - ETA: 1:16 - loss: 0.0659 - regression_loss: 0.0607 - classification_loss: 0.0052 273/500 [===============>..............] - ETA: 1:16 - loss: 0.0658 - regression_loss: 0.0606 - classification_loss: 0.0052 274/500 [===============>..............] - ETA: 1:16 - loss: 0.0657 - regression_loss: 0.0605 - classification_loss: 0.0052 275/500 [===============>..............] - ETA: 1:15 - loss: 0.0655 - regression_loss: 0.0604 - classification_loss: 0.0052 276/500 [===============>..............] - ETA: 1:15 - loss: 0.0653 - regression_loss: 0.0602 - classification_loss: 0.0051 277/500 [===============>..............] - ETA: 1:15 - loss: 0.0654 - regression_loss: 0.0602 - classification_loss: 0.0051 278/500 [===============>..............] - ETA: 1:14 - loss: 0.0652 - regression_loss: 0.0601 - classification_loss: 0.0051 279/500 [===============>..............] - ETA: 1:14 - loss: 0.0650 - regression_loss: 0.0599 - classification_loss: 0.0051 280/500 [===============>..............] - ETA: 1:14 - loss: 0.0651 - regression_loss: 0.0600 - classification_loss: 0.0051 281/500 [===============>..............] 
- ETA: 1:13 - loss: 0.0653 - regression_loss: 0.0602 - classification_loss: 0.0051 282/500 [===============>..............] - ETA: 1:13 - loss: 0.0652 - regression_loss: 0.0601 - classification_loss: 0.0051 283/500 [===============>..............] - ETA: 1:13 - loss: 0.0650 - regression_loss: 0.0599 - classification_loss: 0.0051 284/500 [================>.............] - ETA: 1:12 - loss: 0.0651 - regression_loss: 0.0600 - classification_loss: 0.0051 285/500 [================>.............] - ETA: 1:12 - loss: 0.0649 - regression_loss: 0.0599 - classification_loss: 0.0051 286/500 [================>.............] - ETA: 1:12 - loss: 0.0648 - regression_loss: 0.0598 - classification_loss: 0.0051 287/500 [================>.............] - ETA: 1:11 - loss: 0.0647 - regression_loss: 0.0596 - classification_loss: 0.0051 288/500 [================>.............] - ETA: 1:11 - loss: 0.0646 - regression_loss: 0.0595 - classification_loss: 0.0051 289/500 [================>.............] - ETA: 1:11 - loss: 0.0645 - regression_loss: 0.0595 - classification_loss: 0.0050 290/500 [================>.............] - ETA: 1:10 - loss: 0.0645 - regression_loss: 0.0594 - classification_loss: 0.0050 291/500 [================>.............] - ETA: 1:10 - loss: 0.0644 - regression_loss: 0.0593 - classification_loss: 0.0050 292/500 [================>.............] - ETA: 1:10 - loss: 0.0642 - regression_loss: 0.0592 - classification_loss: 0.0050 293/500 [================>.............] - ETA: 1:09 - loss: 0.0642 - regression_loss: 0.0591 - classification_loss: 0.0050 294/500 [================>.............] - ETA: 1:09 - loss: 0.0645 - regression_loss: 0.0595 - classification_loss: 0.0050 295/500 [================>.............] - ETA: 1:09 - loss: 0.0643 - regression_loss: 0.0593 - classification_loss: 0.0050 296/500 [================>.............] - ETA: 1:08 - loss: 0.0642 - regression_loss: 0.0592 - classification_loss: 0.0050 297/500 [================>.............] 
- ETA: 1:08 - loss: 0.0642 - regression_loss: 0.0592 - classification_loss: 0.0050 298/500 [================>.............] - ETA: 1:08 - loss: 0.0641 - regression_loss: 0.0591 - classification_loss: 0.0050 299/500 [================>.............] - ETA: 1:07 - loss: 0.0639 - regression_loss: 0.0590 - classification_loss: 0.0050 300/500 [=================>............] - ETA: 1:07 - loss: 0.0639 - regression_loss: 0.0589 - classification_loss: 0.0050 301/500 [=================>............] - ETA: 1:07 - loss: 0.0638 - regression_loss: 0.0588 - classification_loss: 0.0050 302/500 [=================>............] - ETA: 1:06 - loss: 0.0636 - regression_loss: 0.0587 - classification_loss: 0.0050 303/500 [=================>............] - ETA: 1:06 - loss: 0.0636 - regression_loss: 0.0586 - classification_loss: 0.0050 304/500 [=================>............] - ETA: 1:06 - loss: 0.0634 - regression_loss: 0.0585 - classification_loss: 0.0049 305/500 [=================>............] - ETA: 1:05 - loss: 0.0633 - regression_loss: 0.0584 - classification_loss: 0.0049 306/500 [=================>............] - ETA: 1:05 - loss: 0.0635 - regression_loss: 0.0585 - classification_loss: 0.0049 307/500 [=================>............] - ETA: 1:05 - loss: 0.0636 - regression_loss: 0.0586 - classification_loss: 0.0050 308/500 [=================>............] - ETA: 1:04 - loss: 0.0635 - regression_loss: 0.0585 - classification_loss: 0.0049 309/500 [=================>............] - ETA: 1:04 - loss: 0.0634 - regression_loss: 0.0585 - classification_loss: 0.0049 310/500 [=================>............] - ETA: 1:04 - loss: 0.0633 - regression_loss: 0.0584 - classification_loss: 0.0049 311/500 [=================>............] - ETA: 1:03 - loss: 0.0632 - regression_loss: 0.0582 - classification_loss: 0.0049 312/500 [=================>............] - ETA: 1:03 - loss: 0.0630 - regression_loss: 0.0581 - classification_loss: 0.0049 313/500 [=================>............] 
- ETA: 1:03 - loss: 0.0631 - regression_loss: 0.0582 - classification_loss: 0.0049 314/500 [=================>............] - ETA: 1:02 - loss: 0.0630 - regression_loss: 0.0581 - classification_loss: 0.0049 315/500 [=================>............] - ETA: 1:02 - loss: 0.0632 - regression_loss: 0.0583 - classification_loss: 0.0050 316/500 [=================>............] - ETA: 1:02 - loss: 0.0636 - regression_loss: 0.0586 - classification_loss: 0.0049 317/500 [==================>...........] - ETA: 1:01 - loss: 0.0634 - regression_loss: 0.0585 - classification_loss: 0.0049 318/500 [==================>...........] - ETA: 1:01 - loss: 0.0635 - regression_loss: 0.0585 - classification_loss: 0.0049 319/500 [==================>...........] - ETA: 1:01 - loss: 0.0633 - regression_loss: 0.0584 - classification_loss: 0.0049 320/500 [==================>...........] - ETA: 1:00 - loss: 0.0637 - regression_loss: 0.0587 - classification_loss: 0.0049 321/500 [==================>...........] - ETA: 1:00 - loss: 0.0636 - regression_loss: 0.0587 - classification_loss: 0.0049 322/500 [==================>...........] - ETA: 1:00 - loss: 0.0635 - regression_loss: 0.0586 - classification_loss: 0.0049 323/500 [==================>...........] - ETA: 59s - loss: 0.0635 - regression_loss: 0.0586 - classification_loss: 0.0049  324/500 [==================>...........] - ETA: 59s - loss: 0.0634 - regression_loss: 0.0585 - classification_loss: 0.0049 325/500 [==================>...........] - ETA: 59s - loss: 0.0633 - regression_loss: 0.0584 - classification_loss: 0.0049 326/500 [==================>...........] - ETA: 58s - loss: 0.0632 - regression_loss: 0.0583 - classification_loss: 0.0049 327/500 [==================>...........] - ETA: 58s - loss: 0.0634 - regression_loss: 0.0585 - classification_loss: 0.0049 328/500 [==================>...........] - ETA: 58s - loss: 0.0633 - regression_loss: 0.0584 - classification_loss: 0.0049 329/500 [==================>...........] 
- ETA: 57s - loss: 0.0635 - regression_loss: 0.0585 - classification_loss: 0.0049 330/500 [==================>...........] - ETA: 57s - loss: 0.0635 - regression_loss: 0.0586 - classification_loss: 0.0049 331/500 [==================>...........] - ETA: 57s - loss: 0.0635 - regression_loss: 0.0585 - classification_loss: 0.0049 332/500 [==================>...........] - ETA: 56s - loss: 0.0636 - regression_loss: 0.0587 - classification_loss: 0.0049 333/500 [==================>...........] - ETA: 56s - loss: 0.0637 - regression_loss: 0.0588 - classification_loss: 0.0049 334/500 [===================>..........] - ETA: 56s - loss: 0.0637 - regression_loss: 0.0588 - classification_loss: 0.0049 335/500 [===================>..........] - ETA: 55s - loss: 0.0639 - regression_loss: 0.0589 - classification_loss: 0.0049 336/500 [===================>..........] - ETA: 55s - loss: 0.0645 - regression_loss: 0.0595 - classification_loss: 0.0050 337/500 [===================>..........] - ETA: 54s - loss: 0.0643 - regression_loss: 0.0594 - classification_loss: 0.0049 338/500 [===================>..........] - ETA: 54s - loss: 0.0642 - regression_loss: 0.0593 - classification_loss: 0.0049 339/500 [===================>..........] - ETA: 54s - loss: 0.0641 - regression_loss: 0.0592 - classification_loss: 0.0049 340/500 [===================>..........] - ETA: 53s - loss: 0.0642 - regression_loss: 0.0592 - classification_loss: 0.0049 341/500 [===================>..........] - ETA: 53s - loss: 0.0640 - regression_loss: 0.0591 - classification_loss: 0.0049 342/500 [===================>..........] - ETA: 53s - loss: 0.0638 - regression_loss: 0.0589 - classification_loss: 0.0049 343/500 [===================>..........] - ETA: 52s - loss: 0.0639 - regression_loss: 0.0590 - classification_loss: 0.0049 344/500 [===================>..........] - ETA: 52s - loss: 0.0638 - regression_loss: 0.0589 - classification_loss: 0.0049 345/500 [===================>..........] 
- ETA: 52s - loss: 0.0637 - regression_loss: 0.0588 - classification_loss: 0.0049 346/500 [===================>..........] - ETA: 51s - loss: 0.0640 - regression_loss: 0.0591 - classification_loss: 0.0049 347/500 [===================>..........] - ETA: 51s - loss: 0.0640 - regression_loss: 0.0591 - classification_loss: 0.0049 348/500 [===================>..........] - ETA: 51s - loss: 0.0639 - regression_loss: 0.0590 - classification_loss: 0.0049 349/500 [===================>..........] - ETA: 50s - loss: 0.0638 - regression_loss: 0.0589 - classification_loss: 0.0049 350/500 [====================>.........] - ETA: 50s - loss: 0.0638 - regression_loss: 0.0589 - classification_loss: 0.0049 351/500 [====================>.........] - ETA: 50s - loss: 0.0636 - regression_loss: 0.0588 - classification_loss: 0.0049 352/500 [====================>.........] - ETA: 49s - loss: 0.0637 - regression_loss: 0.0588 - classification_loss: 0.0049 353/500 [====================>.........] - ETA: 49s - loss: 0.0636 - regression_loss: 0.0587 - classification_loss: 0.0048 354/500 [====================>.........] - ETA: 49s - loss: 0.0639 - regression_loss: 0.0590 - classification_loss: 0.0049 355/500 [====================>.........] - ETA: 48s - loss: 0.0639 - regression_loss: 0.0590 - classification_loss: 0.0049 356/500 [====================>.........] - ETA: 48s - loss: 0.0637 - regression_loss: 0.0589 - classification_loss: 0.0049 357/500 [====================>.........] - ETA: 48s - loss: 0.0642 - regression_loss: 0.0592 - classification_loss: 0.0050 358/500 [====================>.........] - ETA: 47s - loss: 0.0642 - regression_loss: 0.0592 - classification_loss: 0.0051 359/500 [====================>.........] - ETA: 47s - loss: 0.0641 - regression_loss: 0.0590 - classification_loss: 0.0050 360/500 [====================>.........] - ETA: 47s - loss: 0.0640 - regression_loss: 0.0590 - classification_loss: 0.0050 361/500 [====================>.........] 
[per-batch progress updates for steps 362–499 of epoch 62 truncated; loss hovered around 0.063 over this span]
500/500 [==============================] - 169s 338ms/step - loss: 0.0623 - regression_loss: 0.0575 - classification_loss: 0.0048
1172 instances of class plum with average precision: 0.7573
mAP: 0.7573
Epoch 00062: saving model to ./training/snapshots/resnet101_pascal_62.h5
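Logs like the one above can be mined for loss curves rather than read by hand. A minimal sketch (the regex and the `parse_losses` helper are our own illustration, not part of keras-retinanet):

```python
import re

# Matches the loss triple printed on each Keras progress-bar update,
# e.g. "loss: 0.0623 - regression_loss: 0.0575 - classification_loss: 0.0048"
LOSS_RE = re.compile(
    r"loss: ([0-9.]+) - regression_loss: ([0-9.]+) - classification_loss: ([0-9.]+)"
)

def parse_losses(log_text):
    """Return a list of (loss, regression_loss, classification_loss) floats."""
    return [tuple(map(float, m)) for m in LOSS_RE.findall(log_text)]

sample = ("500/500 [==============================] - 169s 338ms/step "
          "- loss: 0.0623 - regression_loss: 0.0575 - classification_loss: 0.0048")
print(parse_losses(sample))  # -> [(0.0623, 0.0575, 0.0048)]
```

Feeding the whole log file through `parse_losses` yields one tuple per progress update, which can then be plotted to see the per-batch trend that the truncated lines above trace out.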